Nov 28 16:20:27 localhost kernel: Linux version 5.14.0-642.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-68.el9) #1 SMP PREEMPT_DYNAMIC Thu Nov 20 14:15:03 UTC 2025
Nov 28 16:20:27 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Nov 28 16:20:27 localhost kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64 root=UUID=b277050f-8ace-464d-abb6-4c46d4c45253 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Nov 28 16:20:27 localhost kernel: BIOS-provided physical RAM map:
Nov 28 16:20:27 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Nov 28 16:20:27 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Nov 28 16:20:27 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Nov 28 16:20:27 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Nov 28 16:20:27 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Nov 28 16:20:27 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Nov 28 16:20:27 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Nov 28 16:20:27 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Nov 28 16:20:27 localhost kernel: NX (Execute Disable) protection: active
Nov 28 16:20:27 localhost kernel: APIC: Static calls initialized
Nov 28 16:20:27 localhost kernel: SMBIOS 2.8 present.
Nov 28 16:20:27 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Nov 28 16:20:27 localhost kernel: Hypervisor detected: KVM
Nov 28 16:20:27 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Nov 28 16:20:27 localhost kernel: kvm-clock: using sched offset of 4345167412 cycles
Nov 28 16:20:27 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Nov 28 16:20:27 localhost kernel: tsc: Detected 2800.000 MHz processor
Nov 28 16:20:27 localhost kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Nov 28 16:20:27 localhost kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Nov 28 16:20:27 localhost kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Nov 28 16:20:27 localhost kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Nov 28 16:20:27 localhost kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Nov 28 16:20:27 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Nov 28 16:20:27 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Nov 28 16:20:27 localhost kernel: Using GB pages for direct mapping
Nov 28 16:20:27 localhost kernel: RAMDISK: [mem 0x2d83a000-0x32c14fff]
Nov 28 16:20:27 localhost kernel: ACPI: Early table checksum verification disabled
Nov 28 16:20:27 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Nov 28 16:20:27 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 28 16:20:27 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 28 16:20:27 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 28 16:20:27 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Nov 28 16:20:27 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 28 16:20:27 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 28 16:20:27 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Nov 28 16:20:27 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Nov 28 16:20:27 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Nov 28 16:20:27 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Nov 28 16:20:27 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Nov 28 16:20:27 localhost kernel: No NUMA configuration found
Nov 28 16:20:27 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Nov 28 16:20:27 localhost kernel: NODE_DATA(0) allocated [mem 0x23ffd5000-0x23fffffff]
Nov 28 16:20:27 localhost kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Nov 28 16:20:27 localhost kernel: Zone ranges:
Nov 28 16:20:27 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Nov 28 16:20:27 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Nov 28 16:20:27 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000023fffffff]
Nov 28 16:20:27 localhost kernel:   Device   empty
Nov 28 16:20:27 localhost kernel: Movable zone start for each node
Nov 28 16:20:27 localhost kernel: Early memory node ranges
Nov 28 16:20:27 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Nov 28 16:20:27 localhost kernel:   node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Nov 28 16:20:27 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000023fffffff]
Nov 28 16:20:27 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Nov 28 16:20:27 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Nov 28 16:20:27 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Nov 28 16:20:27 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Nov 28 16:20:27 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Nov 28 16:20:27 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Nov 28 16:20:27 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Nov 28 16:20:27 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Nov 28 16:20:27 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Nov 28 16:20:27 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Nov 28 16:20:27 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Nov 28 16:20:27 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Nov 28 16:20:27 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Nov 28 16:20:27 localhost kernel: TSC deadline timer available
Nov 28 16:20:27 localhost kernel: CPU topo: Max. logical packages:   8
Nov 28 16:20:27 localhost kernel: CPU topo: Max. logical dies:       8
Nov 28 16:20:27 localhost kernel: CPU topo: Max. dies per package:   1
Nov 28 16:20:27 localhost kernel: CPU topo: Max. threads per core:   1
Nov 28 16:20:27 localhost kernel: CPU topo: Num. cores per package:     1
Nov 28 16:20:27 localhost kernel: CPU topo: Num. threads per package:   1
Nov 28 16:20:27 localhost kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Nov 28 16:20:27 localhost kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Nov 28 16:20:27 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Nov 28 16:20:27 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Nov 28 16:20:27 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Nov 28 16:20:27 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Nov 28 16:20:27 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Nov 28 16:20:27 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Nov 28 16:20:27 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Nov 28 16:20:27 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Nov 28 16:20:27 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Nov 28 16:20:27 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Nov 28 16:20:27 localhost kernel: Booting paravirtualized kernel on KVM
Nov 28 16:20:27 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Nov 28 16:20:27 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Nov 28 16:20:27 localhost kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Nov 28 16:20:27 localhost kernel: pcpu-alloc: s225280 r8192 d28672 u262144 alloc=1*2097152
Nov 28 16:20:27 localhost kernel: pcpu-alloc: [0] 0 1 2 3 4 5 6 7 
Nov 28 16:20:27 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Nov 28 16:20:27 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64 root=UUID=b277050f-8ace-464d-abb6-4c46d4c45253 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Nov 28 16:20:27 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64", will be passed to user space.
Nov 28 16:20:27 localhost kernel: random: crng init done
Nov 28 16:20:27 localhost kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Nov 28 16:20:27 localhost kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Nov 28 16:20:27 localhost kernel: Fallback order for Node 0: 0 
Nov 28 16:20:27 localhost kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Nov 28 16:20:27 localhost kernel: Policy zone: Normal
Nov 28 16:20:27 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Nov 28 16:20:27 localhost kernel: software IO TLB: area num 8.
Nov 28 16:20:27 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Nov 28 16:20:27 localhost kernel: ftrace: allocating 49313 entries in 193 pages
Nov 28 16:20:27 localhost kernel: ftrace: allocated 193 pages with 3 groups
Nov 28 16:20:27 localhost kernel: Dynamic Preempt: voluntary
Nov 28 16:20:27 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Nov 28 16:20:27 localhost kernel: rcu:         RCU event tracing is enabled.
Nov 28 16:20:27 localhost kernel: rcu:         RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Nov 28 16:20:27 localhost kernel:         Trampoline variant of Tasks RCU enabled.
Nov 28 16:20:27 localhost kernel:         Rude variant of Tasks RCU enabled.
Nov 28 16:20:27 localhost kernel:         Tracing variant of Tasks RCU enabled.
Nov 28 16:20:27 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Nov 28 16:20:27 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Nov 28 16:20:27 localhost kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Nov 28 16:20:27 localhost kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Nov 28 16:20:27 localhost kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Nov 28 16:20:27 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Nov 28 16:20:27 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Nov 28 16:20:27 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Nov 28 16:20:27 localhost kernel: Console: colour VGA+ 80x25
Nov 28 16:20:27 localhost kernel: printk: console [ttyS0] enabled
Nov 28 16:20:27 localhost kernel: ACPI: Core revision 20230331
Nov 28 16:20:27 localhost kernel: APIC: Switch to symmetric I/O mode setup
Nov 28 16:20:27 localhost kernel: x2apic enabled
Nov 28 16:20:27 localhost kernel: APIC: Switched APIC routing to: physical x2apic
Nov 28 16:20:27 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Nov 28 16:20:27 localhost kernel: Calibrating delay loop (skipped) preset value.. 5600.00 BogoMIPS (lpj=2800000)
Nov 28 16:20:27 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Nov 28 16:20:27 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Nov 28 16:20:27 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Nov 28 16:20:27 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Nov 28 16:20:27 localhost kernel: Spectre V2 : Mitigation: Retpolines
Nov 28 16:20:27 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Nov 28 16:20:27 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Nov 28 16:20:27 localhost kernel: RETBleed: Mitigation: untrained return thunk
Nov 28 16:20:27 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Nov 28 16:20:27 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Nov 28 16:20:27 localhost kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Nov 28 16:20:27 localhost kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Nov 28 16:20:27 localhost kernel: x86/bugs: return thunk changed
Nov 28 16:20:27 localhost kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Nov 28 16:20:27 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Nov 28 16:20:27 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Nov 28 16:20:27 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Nov 28 16:20:27 localhost kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Nov 28 16:20:27 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Nov 28 16:20:27 localhost kernel: Freeing SMP alternatives memory: 40K
Nov 28 16:20:27 localhost kernel: pid_max: default: 32768 minimum: 301
Nov 28 16:20:27 localhost kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Nov 28 16:20:27 localhost kernel: landlock: Up and running.
Nov 28 16:20:27 localhost kernel: Yama: becoming mindful.
Nov 28 16:20:27 localhost kernel: SELinux:  Initializing.
Nov 28 16:20:27 localhost kernel: LSM support for eBPF active
Nov 28 16:20:27 localhost kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Nov 28 16:20:27 localhost kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Nov 28 16:20:27 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Nov 28 16:20:27 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Nov 28 16:20:27 localhost kernel: ... version:                0
Nov 28 16:20:27 localhost kernel: ... bit width:              48
Nov 28 16:20:27 localhost kernel: ... generic registers:      6
Nov 28 16:20:27 localhost kernel: ... value mask:             0000ffffffffffff
Nov 28 16:20:27 localhost kernel: ... max period:             00007fffffffffff
Nov 28 16:20:27 localhost kernel: ... fixed-purpose events:   0
Nov 28 16:20:27 localhost kernel: ... event mask:             000000000000003f
Nov 28 16:20:27 localhost kernel: signal: max sigframe size: 1776
Nov 28 16:20:27 localhost kernel: rcu: Hierarchical SRCU implementation.
Nov 28 16:20:27 localhost kernel: rcu:         Max phase no-delay instances is 400.
Nov 28 16:20:27 localhost kernel: smp: Bringing up secondary CPUs ...
Nov 28 16:20:27 localhost kernel: smpboot: x86: Booting SMP configuration:
Nov 28 16:20:27 localhost kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Nov 28 16:20:27 localhost kernel: smp: Brought up 1 node, 8 CPUs
Nov 28 16:20:27 localhost kernel: smpboot: Total of 8 processors activated (44800.00 BogoMIPS)
Nov 28 16:20:27 localhost kernel: node 0 deferred pages initialised in 48ms
Nov 28 16:20:27 localhost kernel: Memory: 7765892K/8388068K available (16384K kernel code, 5787K rwdata, 13900K rodata, 4192K init, 7172K bss, 616268K reserved, 0K cma-reserved)
Nov 28 16:20:27 localhost kernel: devtmpfs: initialized
Nov 28 16:20:27 localhost kernel: x86/mm: Memory block size: 128MB
Nov 28 16:20:27 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Nov 28 16:20:27 localhost kernel: futex hash table entries: 2048 (order: 5, 131072 bytes, linear)
Nov 28 16:20:27 localhost kernel: pinctrl core: initialized pinctrl subsystem
Nov 28 16:20:27 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Nov 28 16:20:27 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Nov 28 16:20:27 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Nov 28 16:20:27 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Nov 28 16:20:27 localhost kernel: audit: initializing netlink subsys (disabled)
Nov 28 16:20:27 localhost kernel: audit: type=2000 audit(1764346824.960:1): state=initialized audit_enabled=0 res=1
Nov 28 16:20:27 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Nov 28 16:20:27 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Nov 28 16:20:27 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Nov 28 16:20:27 localhost kernel: cpuidle: using governor menu
Nov 28 16:20:27 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Nov 28 16:20:27 localhost kernel: PCI: Using configuration type 1 for base access
Nov 28 16:20:27 localhost kernel: PCI: Using configuration type 1 for extended access
Nov 28 16:20:27 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Nov 28 16:20:27 localhost kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Nov 28 16:20:27 localhost kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Nov 28 16:20:27 localhost kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Nov 28 16:20:27 localhost kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Nov 28 16:20:27 localhost kernel: Demotion targets for Node 0: null
Nov 28 16:20:27 localhost kernel: cryptd: max_cpu_qlen set to 1000
Nov 28 16:20:27 localhost kernel: ACPI: Added _OSI(Module Device)
Nov 28 16:20:27 localhost kernel: ACPI: Added _OSI(Processor Device)
Nov 28 16:20:27 localhost kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Nov 28 16:20:27 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Nov 28 16:20:27 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Nov 28 16:20:27 localhost kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Nov 28 16:20:27 localhost kernel: ACPI: Interpreter enabled
Nov 28 16:20:27 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Nov 28 16:20:27 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Nov 28 16:20:27 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Nov 28 16:20:27 localhost kernel: PCI: Using E820 reservations for host bridge windows
Nov 28 16:20:27 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Nov 28 16:20:27 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Nov 28 16:20:27 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Nov 28 16:20:27 localhost kernel: acpiphp: Slot [3] registered
Nov 28 16:20:27 localhost kernel: acpiphp: Slot [4] registered
Nov 28 16:20:27 localhost kernel: acpiphp: Slot [5] registered
Nov 28 16:20:27 localhost kernel: acpiphp: Slot [6] registered
Nov 28 16:20:27 localhost kernel: acpiphp: Slot [7] registered
Nov 28 16:20:27 localhost kernel: acpiphp: Slot [8] registered
Nov 28 16:20:27 localhost kernel: acpiphp: Slot [9] registered
Nov 28 16:20:27 localhost kernel: acpiphp: Slot [10] registered
Nov 28 16:20:27 localhost kernel: acpiphp: Slot [11] registered
Nov 28 16:20:27 localhost kernel: acpiphp: Slot [12] registered
Nov 28 16:20:27 localhost kernel: acpiphp: Slot [13] registered
Nov 28 16:20:27 localhost kernel: acpiphp: Slot [14] registered
Nov 28 16:20:27 localhost kernel: acpiphp: Slot [15] registered
Nov 28 16:20:27 localhost kernel: acpiphp: Slot [16] registered
Nov 28 16:20:27 localhost kernel: acpiphp: Slot [17] registered
Nov 28 16:20:27 localhost kernel: acpiphp: Slot [18] registered
Nov 28 16:20:27 localhost kernel: acpiphp: Slot [19] registered
Nov 28 16:20:27 localhost kernel: acpiphp: Slot [20] registered
Nov 28 16:20:27 localhost kernel: acpiphp: Slot [21] registered
Nov 28 16:20:27 localhost kernel: acpiphp: Slot [22] registered
Nov 28 16:20:27 localhost kernel: acpiphp: Slot [23] registered
Nov 28 16:20:27 localhost kernel: acpiphp: Slot [24] registered
Nov 28 16:20:27 localhost kernel: acpiphp: Slot [25] registered
Nov 28 16:20:27 localhost kernel: acpiphp: Slot [26] registered
Nov 28 16:20:27 localhost kernel: acpiphp: Slot [27] registered
Nov 28 16:20:27 localhost kernel: acpiphp: Slot [28] registered
Nov 28 16:20:27 localhost kernel: acpiphp: Slot [29] registered
Nov 28 16:20:27 localhost kernel: acpiphp: Slot [30] registered
Nov 28 16:20:27 localhost kernel: acpiphp: Slot [31] registered
Nov 28 16:20:27 localhost kernel: PCI host bridge to bus 0000:00
Nov 28 16:20:27 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Nov 28 16:20:27 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Nov 28 16:20:27 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Nov 28 16:20:27 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Nov 28 16:20:27 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Nov 28 16:20:27 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Nov 28 16:20:27 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Nov 28 16:20:27 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Nov 28 16:20:27 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Nov 28 16:20:27 localhost kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Nov 28 16:20:27 localhost kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Nov 28 16:20:27 localhost kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Nov 28 16:20:27 localhost kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Nov 28 16:20:27 localhost kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Nov 28 16:20:27 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Nov 28 16:20:27 localhost kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Nov 28 16:20:27 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Nov 28 16:20:27 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Nov 28 16:20:27 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Nov 28 16:20:27 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Nov 28 16:20:27 localhost kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Nov 28 16:20:27 localhost kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Nov 28 16:20:27 localhost kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Nov 28 16:20:27 localhost kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Nov 28 16:20:27 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Nov 28 16:20:27 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Nov 28 16:20:27 localhost kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Nov 28 16:20:27 localhost kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Nov 28 16:20:27 localhost kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Nov 28 16:20:27 localhost kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Nov 28 16:20:27 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Nov 28 16:20:27 localhost kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Nov 28 16:20:27 localhost kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Nov 28 16:20:27 localhost kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Nov 28 16:20:27 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Nov 28 16:20:27 localhost kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Nov 28 16:20:27 localhost kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Nov 28 16:20:27 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Nov 28 16:20:27 localhost kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Nov 28 16:20:27 localhost kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Nov 28 16:20:27 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Nov 28 16:20:27 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Nov 28 16:20:27 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Nov 28 16:20:27 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Nov 28 16:20:27 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Nov 28 16:20:27 localhost kernel: iommu: Default domain type: Translated
Nov 28 16:20:27 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Nov 28 16:20:27 localhost kernel: SCSI subsystem initialized
Nov 28 16:20:27 localhost kernel: ACPI: bus type USB registered
Nov 28 16:20:27 localhost kernel: usbcore: registered new interface driver usbfs
Nov 28 16:20:27 localhost kernel: usbcore: registered new interface driver hub
Nov 28 16:20:27 localhost kernel: usbcore: registered new device driver usb
Nov 28 16:20:27 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Nov 28 16:20:27 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Nov 28 16:20:27 localhost kernel: PTP clock support registered
Nov 28 16:20:27 localhost kernel: EDAC MC: Ver: 3.0.0
Nov 28 16:20:27 localhost kernel: NetLabel: Initializing
Nov 28 16:20:27 localhost kernel: NetLabel:  domain hash size = 128
Nov 28 16:20:27 localhost kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Nov 28 16:20:27 localhost kernel: NetLabel:  unlabeled traffic allowed by default
Nov 28 16:20:27 localhost kernel: PCI: Using ACPI for IRQ routing
Nov 28 16:20:27 localhost kernel: PCI: pci_cache_line_size set to 64 bytes
Nov 28 16:20:27 localhost kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Nov 28 16:20:27 localhost kernel: e820: reserve RAM buffer [mem 0xbffdb000-0xbfffffff]
Nov 28 16:20:27 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Nov 28 16:20:27 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Nov 28 16:20:27 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Nov 28 16:20:27 localhost kernel: vgaarb: loaded
Nov 28 16:20:27 localhost kernel: clocksource: Switched to clocksource kvm-clock
Nov 28 16:20:27 localhost kernel: VFS: Disk quotas dquot_6.6.0
Nov 28 16:20:27 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Nov 28 16:20:27 localhost kernel: pnp: PnP ACPI init
Nov 28 16:20:27 localhost kernel: pnp 00:03: [dma 2]
Nov 28 16:20:27 localhost kernel: pnp: PnP ACPI: found 5 devices
Nov 28 16:20:27 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Nov 28 16:20:27 localhost kernel: NET: Registered PF_INET protocol family
Nov 28 16:20:27 localhost kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Nov 28 16:20:27 localhost kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Nov 28 16:20:27 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Nov 28 16:20:27 localhost kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Nov 28 16:20:27 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Nov 28 16:20:27 localhost kernel: TCP: Hash tables configured (established 65536 bind 65536)
Nov 28 16:20:27 localhost kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Nov 28 16:20:27 localhost kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Nov 28 16:20:27 localhost kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Nov 28 16:20:27 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Nov 28 16:20:27 localhost kernel: NET: Registered PF_XDP protocol family
Nov 28 16:20:27 localhost kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Nov 28 16:20:27 localhost kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Nov 28 16:20:27 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Nov 28 16:20:27 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Nov 28 16:20:27 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Nov 28 16:20:27 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Nov 28 16:20:27 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Nov 28 16:20:27 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Nov 28 16:20:27 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 72190 usecs
Nov 28 16:20:27 localhost kernel: PCI: CLS 0 bytes, default 64
Nov 28 16:20:27 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Nov 28 16:20:27 localhost kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Nov 28 16:20:27 localhost kernel: ACPI: bus type thunderbolt registered
Nov 28 16:20:27 localhost kernel: Trying to unpack rootfs image as initramfs...
Nov 28 16:20:27 localhost kernel: Initialise system trusted keyrings
Nov 28 16:20:27 localhost kernel: Key type blacklist registered
Nov 28 16:20:27 localhost kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Nov 28 16:20:27 localhost kernel: zbud: loaded
Nov 28 16:20:27 localhost kernel: integrity: Platform Keyring initialized
Nov 28 16:20:27 localhost kernel: integrity: Machine keyring initialized
Nov 28 16:20:27 localhost kernel: Freeing initrd memory: 85868K
Nov 28 16:20:27 localhost kernel: NET: Registered PF_ALG protocol family
Nov 28 16:20:27 localhost kernel: xor: automatically using best checksumming function   avx       
Nov 28 16:20:27 localhost kernel: Key type asymmetric registered
Nov 28 16:20:27 localhost kernel: Asymmetric key parser 'x509' registered
Nov 28 16:20:27 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Nov 28 16:20:27 localhost kernel: io scheduler mq-deadline registered
Nov 28 16:20:27 localhost kernel: io scheduler kyber registered
Nov 28 16:20:27 localhost kernel: io scheduler bfq registered
Nov 28 16:20:27 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Nov 28 16:20:27 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Nov 28 16:20:27 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Nov 28 16:20:27 localhost kernel: ACPI: button: Power Button [PWRF]
Nov 28 16:20:27 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Nov 28 16:20:27 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Nov 28 16:20:27 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Nov 28 16:20:27 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Nov 28 16:20:27 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Nov 28 16:20:27 localhost kernel: Non-volatile memory driver v1.3
Nov 28 16:20:27 localhost kernel: rdac: device handler registered
Nov 28 16:20:27 localhost kernel: hp_sw: device handler registered
Nov 28 16:20:27 localhost kernel: emc: device handler registered
Nov 28 16:20:27 localhost kernel: alua: device handler registered
Nov 28 16:20:27 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Nov 28 16:20:27 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Nov 28 16:20:27 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Nov 28 16:20:27 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Nov 28 16:20:27 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Nov 28 16:20:27 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Nov 28 16:20:27 localhost kernel: usb usb1: Product: UHCI Host Controller
Nov 28 16:20:27 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-642.el9.x86_64 uhci_hcd
Nov 28 16:20:27 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Nov 28 16:20:27 localhost kernel: hub 1-0:1.0: USB hub found
Nov 28 16:20:27 localhost kernel: hub 1-0:1.0: 2 ports detected
Nov 28 16:20:27 localhost kernel: usbcore: registered new interface driver usbserial_generic
Nov 28 16:20:27 localhost kernel: usbserial: USB Serial support registered for generic
Nov 28 16:20:27 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Nov 28 16:20:27 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Nov 28 16:20:27 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Nov 28 16:20:27 localhost kernel: mousedev: PS/2 mouse device common for all mice
Nov 28 16:20:27 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Nov 28 16:20:27 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Nov 28 16:20:27 localhost kernel: rtc_cmos 00:04: registered as rtc0
Nov 28 16:20:27 localhost kernel: rtc_cmos 00:04: setting system clock to 2025-11-28T16:20:26 UTC (1764346826)
Nov 28 16:20:27 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Nov 28 16:20:27 localhost kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Nov 28 16:20:27 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Nov 28 16:20:27 localhost kernel: usbcore: registered new interface driver usbhid
Nov 28 16:20:27 localhost kernel: usbhid: USB HID core driver
Nov 28 16:20:27 localhost kernel: drop_monitor: Initializing network drop monitor service
Nov 28 16:20:27 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Nov 28 16:20:27 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Nov 28 16:20:27 localhost kernel: Initializing XFRM netlink socket
Nov 28 16:20:27 localhost kernel: NET: Registered PF_INET6 protocol family
Nov 28 16:20:27 localhost kernel: Segment Routing with IPv6
Nov 28 16:20:27 localhost kernel: NET: Registered PF_PACKET protocol family
Nov 28 16:20:27 localhost kernel: mpls_gso: MPLS GSO support
Nov 28 16:20:27 localhost kernel: IPI shorthand broadcast: enabled
Nov 28 16:20:27 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Nov 28 16:20:27 localhost kernel: AES CTR mode by8 optimization enabled
Nov 28 16:20:27 localhost kernel: sched_clock: Marking stable (2710008139, 173367850)->(3269966099, -386590110)
Nov 28 16:20:27 localhost kernel: registered taskstats version 1
Nov 28 16:20:27 localhost kernel: Loading compiled-in X.509 certificates
Nov 28 16:20:27 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 8ec4bd273f582f9a9b9a494ae677ca1f1488f19e'
Nov 28 16:20:27 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Nov 28 16:20:27 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Nov 28 16:20:27 localhost kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Nov 28 16:20:27 localhost kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Nov 28 16:20:27 localhost kernel: Demotion targets for Node 0: null
Nov 28 16:20:27 localhost kernel: page_owner is disabled
Nov 28 16:20:27 localhost kernel: Key type .fscrypt registered
Nov 28 16:20:27 localhost kernel: Key type fscrypt-provisioning registered
Nov 28 16:20:27 localhost kernel: Key type big_key registered
Nov 28 16:20:27 localhost kernel: Key type encrypted registered
Nov 28 16:20:27 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Nov 28 16:20:27 localhost kernel: Loading compiled-in module X.509 certificates
Nov 28 16:20:27 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 8ec4bd273f582f9a9b9a494ae677ca1f1488f19e'
Nov 28 16:20:27 localhost kernel: ima: Allocated hash algorithm: sha256
Nov 28 16:20:27 localhost kernel: ima: No architecture policies found
Nov 28 16:20:27 localhost kernel: evm: Initialising EVM extended attributes:
Nov 28 16:20:27 localhost kernel: evm: security.selinux
Nov 28 16:20:27 localhost kernel: evm: security.SMACK64 (disabled)
Nov 28 16:20:27 localhost kernel: evm: security.SMACK64EXEC (disabled)
Nov 28 16:20:27 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Nov 28 16:20:27 localhost kernel: evm: security.SMACK64MMAP (disabled)
Nov 28 16:20:27 localhost kernel: evm: security.apparmor (disabled)
Nov 28 16:20:27 localhost kernel: evm: security.ima
Nov 28 16:20:27 localhost kernel: evm: security.capability
Nov 28 16:20:27 localhost kernel: evm: HMAC attrs: 0x1
Nov 28 16:20:27 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Nov 28 16:20:27 localhost kernel: Running certificate verification RSA selftest
Nov 28 16:20:27 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Nov 28 16:20:27 localhost kernel: Running certificate verification ECDSA selftest
Nov 28 16:20:27 localhost kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Nov 28 16:20:27 localhost kernel: clk: Disabling unused clocks
Nov 28 16:20:27 localhost kernel: Freeing unused decrypted memory: 2028K
Nov 28 16:20:27 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Nov 28 16:20:27 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Nov 28 16:20:27 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Nov 28 16:20:27 localhost kernel: usb 1-1: Manufacturer: QEMU
Nov 28 16:20:27 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Nov 28 16:20:27 localhost kernel: Freeing unused kernel image (initmem) memory: 4192K
Nov 28 16:20:27 localhost kernel: Write protecting the kernel read-only data: 30720k
Nov 28 16:20:27 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 436K
Nov 28 16:20:27 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Nov 28 16:20:27 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Nov 28 16:20:27 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Nov 28 16:20:27 localhost kernel: Run /init as init process
Nov 28 16:20:27 localhost kernel:   with arguments:
Nov 28 16:20:27 localhost kernel:     /init
Nov 28 16:20:27 localhost kernel:   with environment:
Nov 28 16:20:27 localhost kernel:     HOME=/
Nov 28 16:20:27 localhost kernel:     TERM=linux
Nov 28 16:20:27 localhost kernel:     BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64
Nov 28 16:20:27 localhost systemd[1]: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Nov 28 16:20:27 localhost systemd[1]: Detected virtualization kvm.
Nov 28 16:20:27 localhost systemd[1]: Detected architecture x86-64.
Nov 28 16:20:27 localhost systemd[1]: Running in initrd.
Nov 28 16:20:27 localhost systemd[1]: No hostname configured, using default hostname.
Nov 28 16:20:27 localhost systemd[1]: Hostname set to <localhost>.
Nov 28 16:20:27 localhost systemd[1]: Initializing machine ID from VM UUID.
Nov 28 16:20:27 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Nov 28 16:20:27 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Nov 28 16:20:27 localhost systemd[1]: Reached target Local Encrypted Volumes.
Nov 28 16:20:27 localhost systemd[1]: Reached target Initrd /usr File System.
Nov 28 16:20:27 localhost systemd[1]: Reached target Local File Systems.
Nov 28 16:20:27 localhost systemd[1]: Reached target Path Units.
Nov 28 16:20:27 localhost systemd[1]: Reached target Slice Units.
Nov 28 16:20:27 localhost systemd[1]: Reached target Swaps.
Nov 28 16:20:27 localhost systemd[1]: Reached target Timer Units.
Nov 28 16:20:27 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Nov 28 16:20:27 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Nov 28 16:20:27 localhost systemd[1]: Listening on Journal Socket.
Nov 28 16:20:27 localhost systemd[1]: Listening on udev Control Socket.
Nov 28 16:20:27 localhost systemd[1]: Listening on udev Kernel Socket.
Nov 28 16:20:27 localhost systemd[1]: Reached target Socket Units.
Nov 28 16:20:27 localhost systemd[1]: Starting Create List of Static Device Nodes...
Nov 28 16:20:27 localhost systemd[1]: Starting Journal Service...
Nov 28 16:20:27 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Nov 28 16:20:27 localhost systemd[1]: Starting Apply Kernel Variables...
Nov 28 16:20:27 localhost systemd[1]: Starting Create System Users...
Nov 28 16:20:27 localhost systemd[1]: Starting Setup Virtual Console...
Nov 28 16:20:27 localhost systemd[1]: Finished Create List of Static Device Nodes.
Nov 28 16:20:27 localhost systemd[1]: Finished Apply Kernel Variables.
Nov 28 16:20:27 localhost systemd[1]: Finished Create System Users.
Nov 28 16:20:27 localhost systemd-journald[307]: Journal started
Nov 28 16:20:27 localhost systemd-journald[307]: Runtime Journal (/run/log/journal/ea074f679ae14549ac73672b8df6afe1) is 8.0M, max 153.6M, 145.6M free.
Nov 28 16:20:27 localhost systemd-sysusers[311]: Creating group 'users' with GID 100.
Nov 28 16:20:27 localhost systemd-sysusers[311]: Creating group 'dbus' with GID 81.
Nov 28 16:20:27 localhost systemd-sysusers[311]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Nov 28 16:20:27 localhost systemd[1]: Started Journal Service.
Nov 28 16:20:27 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Nov 28 16:20:27 localhost systemd[1]: Starting Create Volatile Files and Directories...
Nov 28 16:20:27 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Nov 28 16:20:27 localhost systemd[1]: Finished Create Volatile Files and Directories.
Nov 28 16:20:27 localhost systemd[1]: Finished Setup Virtual Console.
Nov 28 16:20:27 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Nov 28 16:20:27 localhost systemd[1]: Starting dracut cmdline hook...
Nov 28 16:20:27 localhost dracut-cmdline[329]: dracut-9 dracut-057-102.git20250818.el9
Nov 28 16:20:27 localhost dracut-cmdline[329]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64 root=UUID=b277050f-8ace-464d-abb6-4c46d4c45253 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Nov 28 16:20:27 localhost systemd[1]: Finished dracut cmdline hook.
Nov 28 16:20:27 localhost systemd[1]: Starting dracut pre-udev hook...
Nov 28 16:20:27 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Nov 28 16:20:27 localhost kernel: device-mapper: uevent: version 1.0.3
Nov 28 16:20:27 localhost kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Nov 28 16:20:27 localhost kernel: RPC: Registered named UNIX socket transport module.
Nov 28 16:20:27 localhost kernel: RPC: Registered udp transport module.
Nov 28 16:20:27 localhost kernel: RPC: Registered tcp transport module.
Nov 28 16:20:27 localhost kernel: RPC: Registered tcp-with-tls transport module.
Nov 28 16:20:27 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Nov 28 16:20:27 localhost rpc.statd[446]: Version 2.5.4 starting
Nov 28 16:20:27 localhost rpc.statd[446]: Initializing NSM state
Nov 28 16:20:27 localhost rpc.idmapd[451]: Setting log level to 0
Nov 28 16:20:27 localhost systemd[1]: Finished dracut pre-udev hook.
Nov 28 16:20:27 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Nov 28 16:20:27 localhost systemd-udevd[464]: Using default interface naming scheme 'rhel-9.0'.
Nov 28 16:20:28 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Nov 28 16:20:28 localhost systemd[1]: Starting dracut pre-trigger hook...
Nov 28 16:20:28 localhost systemd[1]: Finished dracut pre-trigger hook.
Nov 28 16:20:28 localhost systemd[1]: Starting Coldplug All udev Devices...
Nov 28 16:20:28 localhost systemd[1]: Created slice Slice /system/modprobe.
Nov 28 16:20:28 localhost systemd[1]: Starting Load Kernel Module configfs...
Nov 28 16:20:28 localhost systemd[1]: Finished Coldplug All udev Devices.
Nov 28 16:20:28 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 28 16:20:28 localhost systemd[1]: Finished Load Kernel Module configfs.
Nov 28 16:20:28 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Nov 28 16:20:28 localhost systemd[1]: Reached target Network.
Nov 28 16:20:28 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Nov 28 16:20:28 localhost systemd[1]: Starting dracut initqueue hook...
Nov 28 16:20:28 localhost systemd[1]: Mounting Kernel Configuration File System...
Nov 28 16:20:28 localhost systemd[1]: Mounted Kernel Configuration File System.
Nov 28 16:20:28 localhost systemd[1]: Reached target System Initialization.
Nov 28 16:20:28 localhost systemd[1]: Reached target Basic System.
Nov 28 16:20:28 localhost kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Nov 28 16:20:28 localhost kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Nov 28 16:20:28 localhost kernel:  vda: vda1
Nov 28 16:20:28 localhost systemd[1]: Found device /dev/disk/by-uuid/b277050f-8ace-464d-abb6-4c46d4c45253.
Nov 28 16:20:28 localhost kernel: libata version 3.00 loaded.
Nov 28 16:20:28 localhost kernel: ata_piix 0000:00:01.1: version 2.13
Nov 28 16:20:28 localhost systemd-udevd[503]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 16:20:28 localhost kernel: scsi host0: ata_piix
Nov 28 16:20:28 localhost kernel: scsi host1: ata_piix
Nov 28 16:20:28 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Nov 28 16:20:28 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Nov 28 16:20:28 localhost systemd[1]: Reached target Initrd Root Device.
Nov 28 16:20:28 localhost kernel: ata1: found unknown device (class 0)
Nov 28 16:20:28 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Nov 28 16:20:28 localhost kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Nov 28 16:20:28 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Nov 28 16:20:28 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Nov 28 16:20:28 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Nov 28 16:20:28 localhost kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Nov 28 16:20:28 localhost systemd[1]: Finished dracut initqueue hook.
Nov 28 16:20:28 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Nov 28 16:20:28 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Nov 28 16:20:28 localhost systemd[1]: Reached target Remote File Systems.
Nov 28 16:20:28 localhost systemd[1]: Starting dracut pre-mount hook...
Nov 28 16:20:28 localhost systemd[1]: Finished dracut pre-mount hook.
Nov 28 16:20:28 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/b277050f-8ace-464d-abb6-4c46d4c45253...
Nov 28 16:20:28 localhost systemd-fsck[559]: /usr/sbin/fsck.xfs: XFS file system.
Nov 28 16:20:28 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/b277050f-8ace-464d-abb6-4c46d4c45253.
Nov 28 16:20:28 localhost systemd[1]: Mounting /sysroot...
Nov 28 16:20:29 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Nov 28 16:20:29 localhost kernel: XFS (vda1): Mounting V5 Filesystem b277050f-8ace-464d-abb6-4c46d4c45253
Nov 28 16:20:29 localhost kernel: XFS (vda1): Ending clean mount
Nov 28 16:20:29 localhost systemd[1]: Mounted /sysroot.
Nov 28 16:20:29 localhost systemd[1]: Reached target Initrd Root File System.
Nov 28 16:20:29 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Nov 28 16:20:29 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Nov 28 16:20:29 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Nov 28 16:20:29 localhost systemd[1]: Reached target Initrd File Systems.
Nov 28 16:20:29 localhost systemd[1]: Reached target Initrd Default Target.
Nov 28 16:20:29 localhost systemd[1]: Starting dracut mount hook...
Nov 28 16:20:29 localhost systemd[1]: Finished dracut mount hook.
Nov 28 16:20:29 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Nov 28 16:20:29 localhost rpc.idmapd[451]: exiting on signal 15
Nov 28 16:20:29 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Nov 28 16:20:29 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Nov 28 16:20:29 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Nov 28 16:20:29 localhost systemd[1]: Stopped target Network.
Nov 28 16:20:29 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Nov 28 16:20:29 localhost systemd[1]: Stopped target Timer Units.
Nov 28 16:20:29 localhost systemd[1]: dbus.socket: Deactivated successfully.
Nov 28 16:20:29 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Nov 28 16:20:29 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Nov 28 16:20:29 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Nov 28 16:20:29 localhost systemd[1]: Stopped target Initrd Default Target.
Nov 28 16:20:29 localhost systemd[1]: Stopped target Basic System.
Nov 28 16:20:29 localhost systemd[1]: Stopped target Initrd Root Device.
Nov 28 16:20:29 localhost systemd[1]: Stopped target Initrd /usr File System.
Nov 28 16:20:29 localhost systemd[1]: Stopped target Path Units.
Nov 28 16:20:29 localhost systemd[1]: Stopped target Remote File Systems.
Nov 28 16:20:29 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Nov 28 16:20:29 localhost systemd[1]: Stopped target Slice Units.
Nov 28 16:20:29 localhost systemd[1]: Stopped target Socket Units.
Nov 28 16:20:29 localhost systemd[1]: Stopped target System Initialization.
Nov 28 16:20:29 localhost systemd[1]: Stopped target Local File Systems.
Nov 28 16:20:29 localhost systemd[1]: Stopped target Swaps.
Nov 28 16:20:29 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Nov 28 16:20:29 localhost systemd[1]: Stopped dracut mount hook.
Nov 28 16:20:29 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Nov 28 16:20:29 localhost systemd[1]: Stopped dracut pre-mount hook.
Nov 28 16:20:29 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Nov 28 16:20:29 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Nov 28 16:20:29 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Nov 28 16:20:29 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Nov 28 16:20:29 localhost systemd[1]: Stopped dracut initqueue hook.
Nov 28 16:20:29 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Nov 28 16:20:29 localhost systemd[1]: Stopped Apply Kernel Variables.
Nov 28 16:20:29 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Nov 28 16:20:29 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Nov 28 16:20:29 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Nov 28 16:20:29 localhost systemd[1]: Stopped Coldplug All udev Devices.
Nov 28 16:20:29 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Nov 28 16:20:29 localhost systemd[1]: Stopped dracut pre-trigger hook.
Nov 28 16:20:29 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Nov 28 16:20:29 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Nov 28 16:20:29 localhost systemd[1]: Stopped Setup Virtual Console.
Nov 28 16:20:29 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Nov 28 16:20:29 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Nov 28 16:20:29 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Nov 28 16:20:29 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Nov 28 16:20:29 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Nov 28 16:20:29 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Nov 28 16:20:29 localhost systemd[1]: systemd-udevd.service: Consumed 1.399s CPU time.
Nov 28 16:20:29 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Nov 28 16:20:29 localhost systemd[1]: Closed udev Control Socket.
Nov 28 16:20:29 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Nov 28 16:20:29 localhost systemd[1]: Closed udev Kernel Socket.
Nov 28 16:20:29 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Nov 28 16:20:29 localhost systemd[1]: Stopped dracut pre-udev hook.
Nov 28 16:20:29 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Nov 28 16:20:29 localhost systemd[1]: Stopped dracut cmdline hook.
Nov 28 16:20:30 localhost systemd[1]: Starting Cleanup udev Database...
Nov 28 16:20:30 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Nov 28 16:20:30 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Nov 28 16:20:30 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Nov 28 16:20:30 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Nov 28 16:20:30 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Nov 28 16:20:30 localhost systemd[1]: Stopped Create System Users.
Nov 28 16:20:30 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Nov 28 16:20:30 localhost systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Nov 28 16:20:30 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Nov 28 16:20:30 localhost systemd[1]: Finished Cleanup udev Database.
Nov 28 16:20:30 localhost systemd[1]: Reached target Switch Root.
Nov 28 16:20:30 localhost systemd[1]: Starting Switch Root...
Nov 28 16:20:30 localhost systemd[1]: Switching root.
Nov 28 16:20:30 localhost systemd-journald[307]: Journal stopped
Nov 28 16:20:30 localhost systemd-journald[307]: Received SIGTERM from PID 1 (systemd).
Nov 28 16:20:30 localhost kernel: audit: type=1404 audit(1764346830.159:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Nov 28 16:20:30 localhost kernel: SELinux:  policy capability network_peer_controls=1
Nov 28 16:20:30 localhost kernel: SELinux:  policy capability open_perms=1
Nov 28 16:20:30 localhost kernel: SELinux:  policy capability extended_socket_class=1
Nov 28 16:20:30 localhost kernel: SELinux:  policy capability always_check_network=0
Nov 28 16:20:30 localhost kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 28 16:20:30 localhost kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 28 16:20:30 localhost kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 28 16:20:30 localhost kernel: audit: type=1403 audit(1764346830.316:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Nov 28 16:20:30 localhost systemd[1]: Successfully loaded SELinux policy in 160.545ms.
Nov 28 16:20:30 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 43.276ms.
Nov 28 16:20:30 localhost systemd[1]: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Nov 28 16:20:30 localhost systemd[1]: Detected virtualization kvm.
Nov 28 16:20:30 localhost systemd[1]: Detected architecture x86-64.
Nov 28 16:20:30 localhost systemd-rc-local-generator[640]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 16:20:30 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Nov 28 16:20:30 localhost systemd[1]: Stopped Switch Root.
Nov 28 16:20:30 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Nov 28 16:20:30 localhost systemd[1]: Created slice Slice /system/getty.
Nov 28 16:20:30 localhost systemd[1]: Created slice Slice /system/serial-getty.
Nov 28 16:20:30 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Nov 28 16:20:30 localhost systemd[1]: Created slice User and Session Slice.
Nov 28 16:20:30 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Nov 28 16:20:30 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Nov 28 16:20:30 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Nov 28 16:20:30 localhost systemd[1]: Reached target Local Encrypted Volumes.
Nov 28 16:20:30 localhost systemd[1]: Stopped target Switch Root.
Nov 28 16:20:30 localhost systemd[1]: Stopped target Initrd File Systems.
Nov 28 16:20:30 localhost systemd[1]: Stopped target Initrd Root File System.
Nov 28 16:20:30 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Nov 28 16:20:30 localhost systemd[1]: Reached target Path Units.
Nov 28 16:20:30 localhost systemd[1]: Reached target rpc_pipefs.target.
Nov 28 16:20:30 localhost systemd[1]: Reached target Slice Units.
Nov 28 16:20:30 localhost systemd[1]: Reached target Swaps.
Nov 28 16:20:30 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Nov 28 16:20:30 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Nov 28 16:20:30 localhost systemd[1]: Reached target RPC Port Mapper.
Nov 28 16:20:30 localhost systemd[1]: Listening on Process Core Dump Socket.
Nov 28 16:20:30 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Nov 28 16:20:30 localhost systemd[1]: Listening on udev Control Socket.
Nov 28 16:20:30 localhost systemd[1]: Listening on udev Kernel Socket.
Nov 28 16:20:30 localhost systemd[1]: Mounting Huge Pages File System...
Nov 28 16:20:30 localhost systemd[1]: Mounting POSIX Message Queue File System...
Nov 28 16:20:30 localhost systemd[1]: Mounting Kernel Debug File System...
Nov 28 16:20:30 localhost systemd[1]: Mounting Kernel Trace File System...
Nov 28 16:20:30 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Nov 28 16:20:30 localhost systemd[1]: Starting Create List of Static Device Nodes...
Nov 28 16:20:30 localhost systemd[1]: Starting Load Kernel Module configfs...
Nov 28 16:20:30 localhost systemd[1]: Starting Load Kernel Module drm...
Nov 28 16:20:30 localhost systemd[1]: Starting Load Kernel Module efi_pstore...
Nov 28 16:20:30 localhost systemd[1]: Starting Load Kernel Module fuse...
Nov 28 16:20:30 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Nov 28 16:20:30 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Nov 28 16:20:30 localhost systemd[1]: Stopped File System Check on Root Device.
Nov 28 16:20:30 localhost systemd[1]: Stopped Journal Service.
Nov 28 16:20:30 localhost systemd[1]: Starting Journal Service...
Nov 28 16:20:30 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Nov 28 16:20:30 localhost systemd[1]: Starting Generate network units from Kernel command line...
Nov 28 16:20:30 localhost systemd[1]: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 28 16:20:30 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Nov 28 16:20:30 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Nov 28 16:20:30 localhost systemd[1]: Starting Apply Kernel Variables...
Nov 28 16:20:30 localhost kernel: fuse: init (API version 7.37)
Nov 28 16:20:30 localhost systemd[1]: Starting Coldplug All udev Devices...
Nov 28 16:20:30 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Nov 28 16:20:30 localhost systemd[1]: Mounted Huge Pages File System.
Nov 28 16:20:30 localhost systemd[1]: Mounted POSIX Message Queue File System.
Nov 28 16:20:30 localhost systemd[1]: Mounted Kernel Debug File System.
Nov 28 16:20:30 localhost systemd[1]: Mounted Kernel Trace File System.
Nov 28 16:20:30 localhost systemd[1]: Finished Create List of Static Device Nodes.
Nov 28 16:20:30 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 28 16:20:30 localhost systemd[1]: Finished Load Kernel Module configfs.
Nov 28 16:20:30 localhost systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Nov 28 16:20:30 localhost systemd[1]: Finished Load Kernel Module efi_pstore.
Nov 28 16:20:30 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Nov 28 16:20:30 localhost systemd[1]: Finished Load Kernel Module fuse.
Nov 28 16:20:30 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Nov 28 16:20:30 localhost systemd-journald[681]: Journal started
Nov 28 16:20:30 localhost systemd-journald[681]: Runtime Journal (/run/log/journal/1f988c78c563e12389ab342aced42dbb) is 8.0M, max 153.6M, 145.6M free.
Nov 28 16:20:30 localhost systemd[1]: Queued start job for default target Multi-User System.
Nov 28 16:20:30 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Nov 28 16:20:30 localhost systemd[1]: Started Journal Service.
Nov 28 16:20:30 localhost systemd[1]: Finished Generate network units from Kernel command line.
Nov 28 16:20:30 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Nov 28 16:20:30 localhost systemd[1]: Finished Apply Kernel Variables.
Nov 28 16:20:30 localhost systemd[1]: Mounting FUSE Control File System...
Nov 28 16:20:30 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Nov 28 16:20:30 localhost systemd[1]: Starting Rebuild Hardware Database...
Nov 28 16:20:30 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Nov 28 16:20:30 localhost systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Nov 28 16:20:30 localhost systemd[1]: Starting Load/Save OS Random Seed...
Nov 28 16:20:31 localhost systemd[1]: Starting Create System Users...
Nov 28 16:20:31 localhost kernel: ACPI: bus type drm_connector registered
Nov 28 16:20:31 localhost systemd[1]: Finished Coldplug All udev Devices.
Nov 28 16:20:31 localhost systemd[1]: Finished Load/Save OS Random Seed.
Nov 28 16:20:31 localhost systemd[1]: Mounted FUSE Control File System.
Nov 28 16:20:31 localhost systemd-journald[681]: Runtime Journal (/run/log/journal/1f988c78c563e12389ab342aced42dbb) is 8.0M, max 153.6M, 145.6M free.
Nov 28 16:20:31 localhost systemd-journald[681]: Received client request to flush runtime journal.
Nov 28 16:20:31 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Nov 28 16:20:31 localhost systemd[1]: Finished Load Kernel Module drm.
Nov 28 16:20:31 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Nov 28 16:20:31 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Nov 28 16:20:31 localhost systemd[1]: Finished Create System Users.
Nov 28 16:20:31 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Nov 28 16:20:31 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Nov 28 16:20:31 localhost systemd[1]: Reached target Preparation for Local File Systems.
Nov 28 16:20:31 localhost systemd[1]: Reached target Local File Systems.
Nov 28 16:20:31 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Nov 28 16:20:31 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Nov 28 16:20:31 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Nov 28 16:20:31 localhost systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Nov 28 16:20:31 localhost systemd[1]: Starting Automatic Boot Loader Update...
Nov 28 16:20:31 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Nov 28 16:20:31 localhost systemd[1]: Starting Create Volatile Files and Directories...
Nov 28 16:20:31 localhost bootctl[698]: Couldn't find EFI system partition, skipping.
Nov 28 16:20:31 localhost systemd[1]: Finished Automatic Boot Loader Update.
Nov 28 16:20:31 localhost systemd[1]: Finished Create Volatile Files and Directories.
Nov 28 16:20:31 localhost systemd[1]: Starting Security Auditing Service...
Nov 28 16:20:31 localhost systemd[1]: Starting RPC Bind...
Nov 28 16:20:31 localhost systemd[1]: Starting Rebuild Journal Catalog...
Nov 28 16:20:31 localhost auditd[704]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Nov 28 16:20:31 localhost auditd[704]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Nov 28 16:20:31 localhost systemd[1]: Finished Rebuild Journal Catalog.
Nov 28 16:20:31 localhost systemd[1]: Started RPC Bind.
Nov 28 16:20:31 localhost augenrules[709]: /sbin/augenrules: No change
Nov 28 16:20:31 localhost augenrules[724]: No rules
Nov 28 16:20:31 localhost augenrules[724]: enabled 1
Nov 28 16:20:31 localhost augenrules[724]: failure 1
Nov 28 16:20:31 localhost augenrules[724]: pid 704
Nov 28 16:20:31 localhost augenrules[724]: rate_limit 0
Nov 28 16:20:31 localhost augenrules[724]: backlog_limit 8192
Nov 28 16:20:31 localhost augenrules[724]: lost 0
Nov 28 16:20:31 localhost augenrules[724]: backlog 0
Nov 28 16:20:31 localhost augenrules[724]: backlog_wait_time 60000
Nov 28 16:20:31 localhost augenrules[724]: backlog_wait_time_actual 0
Nov 28 16:20:31 localhost augenrules[724]: enabled 1
Nov 28 16:20:31 localhost augenrules[724]: failure 1
Nov 28 16:20:31 localhost augenrules[724]: pid 704
Nov 28 16:20:31 localhost augenrules[724]: rate_limit 0
Nov 28 16:20:31 localhost augenrules[724]: backlog_limit 8192
Nov 28 16:20:31 localhost augenrules[724]: lost 0
Nov 28 16:20:31 localhost augenrules[724]: backlog 0
Nov 28 16:20:31 localhost augenrules[724]: backlog_wait_time 60000
Nov 28 16:20:31 localhost augenrules[724]: backlog_wait_time_actual 0
Nov 28 16:20:31 localhost augenrules[724]: enabled 1
Nov 28 16:20:31 localhost augenrules[724]: failure 1
Nov 28 16:20:31 localhost augenrules[724]: pid 704
Nov 28 16:20:31 localhost augenrules[724]: rate_limit 0
Nov 28 16:20:31 localhost augenrules[724]: backlog_limit 8192
Nov 28 16:20:31 localhost augenrules[724]: lost 0
Nov 28 16:20:31 localhost augenrules[724]: backlog 0
Nov 28 16:20:31 localhost augenrules[724]: backlog_wait_time 60000
Nov 28 16:20:31 localhost augenrules[724]: backlog_wait_time_actual 0
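The `augenrules` block above echoes the kernel audit status as plain `name value` pairs (the same fields `auditctl -s` reports). A minimal sketch of collecting such a block into a dict for checking, using sample lines copied from this log:

```python
# Parse the kernel audit status lines that augenrules echoes.
# Sample text copied verbatim from the log above.
sample = """\
enabled 1
failure 1
pid 704
rate_limit 0
backlog_limit 8192
lost 0
backlog 0
backlog_wait_time 60000
backlog_wait_time_actual 0
"""

def parse_audit_status(text):
    """Return {field: int} for each 'name value' line."""
    status = {}
    for line in text.splitlines():
        name, _, value = line.partition(" ")
        value = value.strip()
        if value.lstrip("-").isdigit():
            status[name] = int(value)
    return status

status = parse_audit_status(sample)
print(status["backlog_limit"])  # 8192
print(status["lost"])           # 0
```

Here `lost 0` and `backlog 0` indicate no audit records were dropped or queued at startup.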
Nov 28 16:20:31 localhost systemd[1]: Started Security Auditing Service.
Nov 28 16:20:31 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Nov 28 16:20:31 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Nov 28 16:20:31 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Nov 28 16:20:32 localhost systemd[1]: Finished Rebuild Hardware Database.
Nov 28 16:20:32 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Nov 28 16:20:32 localhost systemd[1]: Starting Update is Completed...
Nov 28 16:20:32 localhost systemd[1]: Finished Update is Completed.
Nov 28 16:20:32 localhost systemd-udevd[732]: Using default interface naming scheme 'rhel-9.0'.
Nov 28 16:20:32 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Nov 28 16:20:32 localhost systemd[1]: Reached target System Initialization.
Nov 28 16:20:32 localhost systemd[1]: Started dnf makecache --timer.
Nov 28 16:20:32 localhost systemd[1]: Started Daily rotation of log files.
Nov 28 16:20:32 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Nov 28 16:20:32 localhost systemd[1]: Reached target Timer Units.
Nov 28 16:20:32 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Nov 28 16:20:32 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Nov 28 16:20:32 localhost systemd[1]: Reached target Socket Units.
Nov 28 16:20:32 localhost systemd[1]: Starting D-Bus System Message Bus...
Nov 28 16:20:32 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 28 16:20:32 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Nov 28 16:20:32 localhost systemd[1]: Starting Load Kernel Module configfs...
Nov 28 16:20:32 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 28 16:20:32 localhost systemd[1]: Finished Load Kernel Module configfs.
Nov 28 16:20:32 localhost systemd-udevd[740]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 16:20:32 localhost systemd[1]: Started D-Bus System Message Bus.
Nov 28 16:20:32 localhost systemd[1]: Reached target Basic System.
Nov 28 16:20:32 localhost dbus-broker-lau[758]: Ready
Nov 28 16:20:32 localhost systemd[1]: Starting NTP client/server...
Nov 28 16:20:32 localhost systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Nov 28 16:20:32 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Nov 28 16:20:32 localhost systemd[1]: Starting IPv4 firewall with iptables...
Nov 28 16:20:32 localhost systemd[1]: Started irqbalance daemon.
Nov 28 16:20:32 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Nov 28 16:20:32 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 28 16:20:32 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 28 16:20:32 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 28 16:20:32 localhost systemd[1]: Reached target sshd-keygen.target.
Nov 28 16:20:32 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Nov 28 16:20:32 localhost systemd[1]: Reached target User and Group Name Lookups.
Nov 28 16:20:32 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Nov 28 16:20:32 localhost chronyd[792]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Nov 28 16:20:32 localhost chronyd[792]: Loaded 0 symmetric keys
Nov 28 16:20:32 localhost systemd[1]: Starting User Login Management...
Nov 28 16:20:32 localhost chronyd[792]: Using right/UTC timezone to obtain leap second data
Nov 28 16:20:32 localhost chronyd[792]: Loaded seccomp filter (level 2)
Nov 28 16:20:32 localhost systemd[1]: Started NTP client/server.
Nov 28 16:20:32 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Nov 28 16:20:32 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Nov 28 16:20:32 localhost kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Nov 28 16:20:32 localhost kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Nov 28 16:20:32 localhost kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Nov 28 16:20:32 localhost kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Nov 28 16:20:32 localhost systemd-logind[788]: Watching system buttons on /dev/input/event0 (Power Button)
Nov 28 16:20:32 localhost systemd-logind[788]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Nov 28 16:20:32 localhost systemd-logind[788]: New seat seat0.
Nov 28 16:20:32 localhost systemd[1]: Started User Login Management.
Nov 28 16:20:33 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Nov 28 16:20:33 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Nov 28 16:20:33 localhost kernel: Console: switching to colour dummy device 80x25
Nov 28 16:20:33 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Nov 28 16:20:33 localhost kernel: [drm] features: -context_init
Nov 28 16:20:33 localhost kernel: [drm] number of scanouts: 1
Nov 28 16:20:33 localhost kernel: [drm] number of cap sets: 0
Nov 28 16:20:33 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Nov 28 16:20:33 localhost kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Nov 28 16:20:33 localhost kernel: Console: switching to colour frame buffer device 128x48
Nov 28 16:20:33 localhost kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Nov 28 16:20:33 localhost kernel: kvm_amd: TSC scaling supported
Nov 28 16:20:33 localhost kernel: kvm_amd: Nested Virtualization enabled
Nov 28 16:20:33 localhost kernel: kvm_amd: Nested Paging enabled
Nov 28 16:20:33 localhost kernel: kvm_amd: LBR virtualization supported
Nov 28 16:20:33 localhost iptables.init[779]: iptables: Applying firewall rules: [  OK  ]
Nov 28 16:20:33 localhost systemd[1]: Finished IPv4 firewall with iptables.
Nov 28 16:20:33 localhost cloud-init[840]: Cloud-init v. 24.4-7.el9 running 'init-local' at Fri, 28 Nov 2025 16:20:33 +0000. Up 9.73 seconds.
Nov 28 16:20:33 localhost kernel: ISO 9660 Extensions: Microsoft Joliet Level 3
Nov 28 16:20:33 localhost kernel: ISO 9660 Extensions: RRIP_1991A
Nov 28 16:20:33 localhost systemd[1]: run-cloud\x2dinit-tmp-tmpe2u4_pkk.mount: Deactivated successfully.
Nov 28 16:20:33 localhost systemd[1]: Starting Hostname Service...
Nov 28 16:20:33 localhost systemd[1]: Started Hostname Service.
Nov 28 16:20:33 np0005539040.novalocal systemd-hostnamed[854]: Hostname set to <np0005539040.novalocal> (static)
Nov 28 16:20:34 np0005539040.novalocal systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Nov 28 16:20:34 np0005539040.novalocal systemd[1]: Reached target Preparation for Network.
Nov 28 16:20:34 np0005539040.novalocal systemd[1]: Starting Network Manager...
Nov 28 16:20:34 np0005539040.novalocal NetworkManager[858]: <info>  [1764346834.0866] NetworkManager (version 1.54.1-1.el9) is starting... (boot:554a2c9a-1114-4202-a0a9-67957e011662)
Nov 28 16:20:34 np0005539040.novalocal NetworkManager[858]: <info>  [1764346834.0872] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Nov 28 16:20:34 np0005539040.novalocal NetworkManager[858]: <info>  [1764346834.0942] manager[0x559ac267b080]: monitoring kernel firmware directory '/lib/firmware'.
Nov 28 16:20:34 np0005539040.novalocal NetworkManager[858]: <info>  [1764346834.0977] hostname: hostname: using hostnamed
Nov 28 16:20:34 np0005539040.novalocal NetworkManager[858]: <info>  [1764346834.0977] hostname: static hostname changed from (none) to "np0005539040.novalocal"
Nov 28 16:20:34 np0005539040.novalocal NetworkManager[858]: <info>  [1764346834.0981] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 28 16:20:34 np0005539040.novalocal NetworkManager[858]: <info>  [1764346834.1129] manager[0x559ac267b080]: rfkill: Wi-Fi hardware radio set enabled
Nov 28 16:20:34 np0005539040.novalocal NetworkManager[858]: <info>  [1764346834.1129] manager[0x559ac267b080]: rfkill: WWAN hardware radio set enabled
Nov 28 16:20:34 np0005539040.novalocal NetworkManager[858]: <info>  [1764346834.1168] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Nov 28 16:20:34 np0005539040.novalocal NetworkManager[858]: <info>  [1764346834.1169] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 28 16:20:34 np0005539040.novalocal NetworkManager[858]: <info>  [1764346834.1170] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 28 16:20:34 np0005539040.novalocal NetworkManager[858]: <info>  [1764346834.1172] manager: Networking is enabled by state file
Nov 28 16:20:34 np0005539040.novalocal systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Nov 28 16:20:34 np0005539040.novalocal NetworkManager[858]: <info>  [1764346834.1174] settings: Loaded settings plugin: keyfile (internal)
Nov 28 16:20:34 np0005539040.novalocal NetworkManager[858]: <info>  [1764346834.1194] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 28 16:20:34 np0005539040.novalocal NetworkManager[858]: <info>  [1764346834.1217] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Nov 28 16:20:34 np0005539040.novalocal NetworkManager[858]: <info>  [1764346834.1230] dhcp: init: Using DHCP client 'internal'
Nov 28 16:20:34 np0005539040.novalocal NetworkManager[858]: <info>  [1764346834.1233] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 28 16:20:34 np0005539040.novalocal NetworkManager[858]: <info>  [1764346834.1248] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 28 16:20:34 np0005539040.novalocal NetworkManager[858]: <info>  [1764346834.1256] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Nov 28 16:20:34 np0005539040.novalocal NetworkManager[858]: <info>  [1764346834.1263] device (lo): Activation: starting connection 'lo' (2464996e-5b9c-4662-8d02-714ba4ef3d59)
Nov 28 16:20:34 np0005539040.novalocal NetworkManager[858]: <info>  [1764346834.1273] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 28 16:20:34 np0005539040.novalocal NetworkManager[858]: <info>  [1764346834.1277] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 28 16:20:34 np0005539040.novalocal NetworkManager[858]: <info>  [1764346834.1310] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 28 16:20:34 np0005539040.novalocal NetworkManager[858]: <info>  [1764346834.1315] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Nov 28 16:20:34 np0005539040.novalocal NetworkManager[858]: <info>  [1764346834.1319] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Nov 28 16:20:34 np0005539040.novalocal NetworkManager[858]: <info>  [1764346834.1322] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Nov 28 16:20:34 np0005539040.novalocal NetworkManager[858]: <info>  [1764346834.1325] device (eth0): carrier: link connected
Nov 28 16:20:34 np0005539040.novalocal NetworkManager[858]: <info>  [1764346834.1329] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Nov 28 16:20:34 np0005539040.novalocal NetworkManager[858]: <info>  [1764346834.1336] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Nov 28 16:20:34 np0005539040.novalocal NetworkManager[858]: <info>  [1764346834.1343] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 28 16:20:34 np0005539040.novalocal NetworkManager[858]: <info>  [1764346834.1348] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 28 16:20:34 np0005539040.novalocal NetworkManager[858]: <info>  [1764346834.1350] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 28 16:20:34 np0005539040.novalocal NetworkManager[858]: <info>  [1764346834.1352] manager: NetworkManager state is now CONNECTING
Nov 28 16:20:34 np0005539040.novalocal NetworkManager[858]: <info>  [1764346834.1354] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 28 16:20:34 np0005539040.novalocal NetworkManager[858]: <info>  [1764346834.1362] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 28 16:20:34 np0005539040.novalocal NetworkManager[858]: <info>  [1764346834.1365] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 28 16:20:34 np0005539040.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 28 16:20:34 np0005539040.novalocal NetworkManager[858]: <info>  [1764346834.1399] dhcp4 (eth0): state changed new lease, address=38.129.56.212
Nov 28 16:20:34 np0005539040.novalocal NetworkManager[858]: <info>  [1764346834.1409] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 28 16:20:34 np0005539040.novalocal NetworkManager[858]: <info>  [1764346834.1434] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 28 16:20:34 np0005539040.novalocal systemd[1]: Started Network Manager.
Nov 28 16:20:34 np0005539040.novalocal systemd[1]: Reached target Network.
Nov 28 16:20:34 np0005539040.novalocal systemd[1]: Starting Network Manager Wait Online...
Nov 28 16:20:34 np0005539040.novalocal systemd[1]: Starting GSSAPI Proxy Daemon...
Nov 28 16:20:34 np0005539040.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 28 16:20:34 np0005539040.novalocal systemd[1]: Started GSSAPI Proxy Daemon.
Nov 28 16:20:34 np0005539040.novalocal systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Nov 28 16:20:34 np0005539040.novalocal systemd[1]: Reached target NFS client services.
Nov 28 16:20:34 np0005539040.novalocal systemd[1]: Reached target Preparation for Remote File Systems.
Nov 28 16:20:34 np0005539040.novalocal systemd[1]: Reached target Remote File Systems.
Nov 28 16:20:34 np0005539040.novalocal NetworkManager[858]: <info>  [1764346834.1763] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Nov 28 16:20:34 np0005539040.novalocal NetworkManager[858]: <info>  [1764346834.1767] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Nov 28 16:20:34 np0005539040.novalocal NetworkManager[858]: <info>  [1764346834.1773] device (lo): Activation: successful, device activated.
Nov 28 16:20:34 np0005539040.novalocal systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 28 16:20:34 np0005539040.novalocal NetworkManager[858]: <info>  [1764346834.1794] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 28 16:20:34 np0005539040.novalocal NetworkManager[858]: <info>  [1764346834.1796] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 28 16:20:34 np0005539040.novalocal NetworkManager[858]: <info>  [1764346834.1799] manager: NetworkManager state is now CONNECTED_SITE
Nov 28 16:20:34 np0005539040.novalocal NetworkManager[858]: <info>  [1764346834.1802] device (eth0): Activation: successful, device activated.
Nov 28 16:20:34 np0005539040.novalocal NetworkManager[858]: <info>  [1764346834.1807] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 28 16:20:34 np0005539040.novalocal NetworkManager[858]: <info>  [1764346834.1810] manager: startup complete
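The NetworkManager lines above trace each device through its activation states (`disconnected -> prepare -> config -> ip-config -> ip-check -> secondaries -> activated`). A sketch of extracting those transitions, with sample messages copied from the eth0 lines in this log:

```python
import re

# Match messages like:
#   "device (eth0): state change: ip-config -> ip-check (reason 'none', ...)"
STATE_RE = re.compile(
    r"device \((?P<dev>\S+)\): state change: (?P<old>[\w-]+) -> (?P<new>[\w-]+)"
)

# Sample messages copied from the log above.
messages = [
    "device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')",
    "device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')",
    "device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')",
    "device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')",
]

def transitions(lines):
    """Yield (device, old_state, new_state) for each state-change message."""
    for line in lines:
        m = STATE_RE.search(line)
        if m:
            yield m.group("dev"), m.group("old"), m.group("new")

path = list(transitions(messages))
print(path[0])  # ('eth0', 'disconnected', 'prepare')
```

A stalled activation shows up as a device that never reaches `activated`; the trace above shows eth0 progressing normally until DHCP completes.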
Nov 28 16:20:34 np0005539040.novalocal systemd[1]: Finished Network Manager Wait Online.
Nov 28 16:20:34 np0005539040.novalocal systemd[1]: Starting Cloud-init: Network Stage...
Nov 28 16:20:34 np0005539040.novalocal cloud-init[922]: Cloud-init v. 24.4-7.el9 running 'init' at Fri, 28 Nov 2025 16:20:34 +0000. Up 10.82 seconds.
Nov 28 16:20:34 np0005539040.novalocal cloud-init[922]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Nov 28 16:20:34 np0005539040.novalocal cloud-init[922]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 28 16:20:34 np0005539040.novalocal cloud-init[922]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Nov 28 16:20:34 np0005539040.novalocal cloud-init[922]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 28 16:20:34 np0005539040.novalocal cloud-init[922]: ci-info: |  eth0  | True |        38.129.56.212         | 255.255.255.0 | global | fa:16:3e:1c:43:97 |
Nov 28 16:20:34 np0005539040.novalocal cloud-init[922]: ci-info: |  eth0  | True | fe80::f816:3eff:fe1c:4397/64 |       .       |  link  | fa:16:3e:1c:43:97 |
Nov 28 16:20:34 np0005539040.novalocal cloud-init[922]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Nov 28 16:20:34 np0005539040.novalocal cloud-init[922]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Nov 28 16:20:34 np0005539040.novalocal cloud-init[922]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 28 16:20:34 np0005539040.novalocal cloud-init[922]: ci-info: ++++++++++++++++++++++++++++++++Route IPv4 info++++++++++++++++++++++++++++++++
Nov 28 16:20:34 np0005539040.novalocal cloud-init[922]: ci-info: +-------+-----------------+-------------+-----------------+-----------+-------+
Nov 28 16:20:34 np0005539040.novalocal cloud-init[922]: ci-info: | Route |   Destination   |   Gateway   |     Genmask     | Interface | Flags |
Nov 28 16:20:34 np0005539040.novalocal cloud-init[922]: ci-info: +-------+-----------------+-------------+-----------------+-----------+-------+
Nov 28 16:20:34 np0005539040.novalocal cloud-init[922]: ci-info: |   0   |     0.0.0.0     | 38.129.56.1 |     0.0.0.0     |    eth0   |   UG  |
Nov 28 16:20:34 np0005539040.novalocal cloud-init[922]: ci-info: |   1   |   38.129.56.0   |   0.0.0.0   |  255.255.255.0  |    eth0   |   U   |
Nov 28 16:20:34 np0005539040.novalocal cloud-init[922]: ci-info: |   2   | 169.254.169.254 | 38.129.56.5 | 255.255.255.255 |    eth0   |  UGH  |
Nov 28 16:20:34 np0005539040.novalocal cloud-init[922]: ci-info: +-------+-----------------+-------------+-----------------+-----------+-------+
Nov 28 16:20:34 np0005539040.novalocal cloud-init[922]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Nov 28 16:20:34 np0005539040.novalocal cloud-init[922]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 28 16:20:34 np0005539040.novalocal cloud-init[922]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Nov 28 16:20:34 np0005539040.novalocal cloud-init[922]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 28 16:20:34 np0005539040.novalocal cloud-init[922]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Nov 28 16:20:34 np0005539040.novalocal cloud-init[922]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Nov 28 16:20:34 np0005539040.novalocal cloud-init[922]: ci-info: +-------+-------------+---------+-----------+-------+
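cloud-init renders its `ci-info` network summary as ASCII tables. A sketch of pulling the IPv4 route rows back into structured records, with row text copied from the table above:

```python
# Parse rows of cloud-init's "Route IPv4 info" table.
# Columns: Route, Destination, Gateway, Genmask, Interface, Flags.
rows = [
    "|   0   |     0.0.0.0     | 38.129.56.1 |     0.0.0.0     |    eth0   |   UG  |",
    "|   1   |   38.129.56.0   |   0.0.0.0   |  255.255.255.0  |    eth0   |   U   |",
    "|   2   | 169.254.169.254 | 38.129.56.5 | 255.255.255.255 |    eth0   |  UGH  |",
]

def parse_row(row):
    """Split one '|'-delimited table row into a dict keyed by column name."""
    cells = [c.strip() for c in row.strip().strip("|").split("|")]
    keys = ("route", "destination", "gateway", "genmask", "interface", "flags")
    return dict(zip(keys, cells))

routes = [parse_row(r) for r in rows]
default = next(r for r in routes if r["destination"] == "0.0.0.0")
print(default["gateway"])  # 38.129.56.1
```

Route 2 (a host route to 169.254.169.254 via 38.129.56.5, flags `UGH`) is the OpenStack metadata-service route that cloud-init depends on.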
Nov 28 16:20:35 np0005539040.novalocal useradd[988]: new group: name=cloud-user, GID=1001
Nov 28 16:20:35 np0005539040.novalocal useradd[988]: new user: name=cloud-user, UID=1001, GID=1001, home=/home/cloud-user, shell=/bin/bash, from=none
Nov 28 16:20:35 np0005539040.novalocal useradd[988]: add 'cloud-user' to group 'adm'
Nov 28 16:20:35 np0005539040.novalocal useradd[988]: add 'cloud-user' to group 'systemd-journal'
Nov 28 16:20:35 np0005539040.novalocal useradd[988]: add 'cloud-user' to shadow group 'adm'
Nov 28 16:20:35 np0005539040.novalocal useradd[988]: add 'cloud-user' to shadow group 'systemd-journal'
Nov 28 16:20:35 np0005539040.novalocal cloud-init[922]: Generating public/private rsa key pair.
Nov 28 16:20:35 np0005539040.novalocal cloud-init[922]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Nov 28 16:20:35 np0005539040.novalocal cloud-init[922]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Nov 28 16:20:35 np0005539040.novalocal cloud-init[922]: The key fingerprint is:
Nov 28 16:20:35 np0005539040.novalocal cloud-init[922]: SHA256:farp3E8jQRJhDG0UTnBH/4XcH2zDvYtDge7Wui8NH34 root@np0005539040.novalocal
Nov 28 16:20:35 np0005539040.novalocal cloud-init[922]: The key's randomart image is:
Nov 28 16:20:35 np0005539040.novalocal cloud-init[922]: +---[RSA 3072]----+
Nov 28 16:20:35 np0005539040.novalocal cloud-init[922]: |      o*O+o      |
Nov 28 16:20:35 np0005539040.novalocal cloud-init[922]: |       =+o ...oo.|
Nov 28 16:20:35 np0005539040.novalocal cloud-init[922]: |       .o ....o*+|
Nov 28 16:20:35 np0005539040.novalocal cloud-init[922]: |         +.  .o.=|
Nov 28 16:20:35 np0005539040.novalocal cloud-init[922]: |        S o......|
Nov 28 16:20:35 np0005539040.novalocal cloud-init[922]: |          .=o.. .|
Nov 28 16:20:35 np0005539040.novalocal cloud-init[922]: |          ooB+.. |
Nov 28 16:20:35 np0005539040.novalocal cloud-init[922]: |       . +.+.=.E |
Nov 28 16:20:35 np0005539040.novalocal cloud-init[922]: |       .= .+=..  |
Nov 28 16:20:35 np0005539040.novalocal cloud-init[922]: +----[SHA256]-----+
Nov 28 16:20:35 np0005539040.novalocal cloud-init[922]: Generating public/private ecdsa key pair.
Nov 28 16:20:35 np0005539040.novalocal cloud-init[922]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Nov 28 16:20:35 np0005539040.novalocal cloud-init[922]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Nov 28 16:20:35 np0005539040.novalocal cloud-init[922]: The key fingerprint is:
Nov 28 16:20:35 np0005539040.novalocal cloud-init[922]: SHA256:DFdfT6E7ulfSDaqAF7/+BQWg4+jN/nwIEieTFXe+sv8 root@np0005539040.novalocal
Nov 28 16:20:35 np0005539040.novalocal cloud-init[922]: The key's randomart image is:
Nov 28 16:20:35 np0005539040.novalocal cloud-init[922]: +---[ECDSA 256]---+
Nov 28 16:20:35 np0005539040.novalocal cloud-init[922]: |        ..+.o ..o|
Nov 28 16:20:35 np0005539040.novalocal cloud-init[922]: |        .+ + o.o |
Nov 28 16:20:35 np0005539040.novalocal cloud-init[922]: |      .o+   o.. .|
Nov 28 16:20:35 np0005539040.novalocal cloud-init[922]: |      =*o.   oo  |
Nov 28 16:20:35 np0005539040.novalocal cloud-init[922]: |      .*So. o+...|
Nov 28 16:20:35 np0005539040.novalocal cloud-init[922]: |     .oo+ .ooo.o.|
Nov 28 16:20:35 np0005539040.novalocal cloud-init[922]: |      .ooo.=  +  |
Nov 28 16:20:35 np0005539040.novalocal cloud-init[922]: |       . .+.oo   |
Nov 28 16:20:35 np0005539040.novalocal cloud-init[922]: |        .o+++.E  |
Nov 28 16:20:35 np0005539040.novalocal cloud-init[922]: +----[SHA256]-----+
Nov 28 16:20:35 np0005539040.novalocal cloud-init[922]: Generating public/private ed25519 key pair.
Nov 28 16:20:35 np0005539040.novalocal cloud-init[922]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Nov 28 16:20:35 np0005539040.novalocal cloud-init[922]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Nov 28 16:20:35 np0005539040.novalocal cloud-init[922]: The key fingerprint is:
Nov 28 16:20:35 np0005539040.novalocal cloud-init[922]: SHA256:HntNn97g1PYoGDYo0F24Ohl+GG94IkJs7jg8IumR/1M root@np0005539040.novalocal
Nov 28 16:20:35 np0005539040.novalocal cloud-init[922]: The key's randomart image is:
Nov 28 16:20:35 np0005539040.novalocal cloud-init[922]: +--[ED25519 256]--+
Nov 28 16:20:35 np0005539040.novalocal cloud-init[922]: |         .       |
Nov 28 16:20:35 np0005539040.novalocal cloud-init[922]: |        . .      |
Nov 28 16:20:35 np0005539040.novalocal cloud-init[922]: | .   . . o       |
Nov 28 16:20:35 np0005539040.novalocal cloud-init[922]: |  + . + o        |
Nov 28 16:20:35 np0005539040.novalocal cloud-init[922]: | +   o OS.  .    |
Nov 28 16:20:35 np0005539040.novalocal cloud-init[922]: |  + . XE*o+o . o |
Nov 28 16:20:35 np0005539040.novalocal cloud-init[922]: |.* . ..Bo..+. = o|
Nov 28 16:20:35 np0005539040.novalocal cloud-init[922]: |*o+  .   .. .+ =.|
Nov 28 16:20:35 np0005539040.novalocal cloud-init[922]: |+o.....      .+ o|
Nov 28 16:20:35 np0005539040.novalocal cloud-init[922]: +----[SHA256]-----+
Nov 28 16:20:35 np0005539040.novalocal systemd[1]: Finished Cloud-init: Network Stage.
Nov 28 16:20:35 np0005539040.novalocal systemd[1]: Reached target Cloud-config availability.
Nov 28 16:20:35 np0005539040.novalocal systemd[1]: Reached target Network is Online.
Nov 28 16:20:35 np0005539040.novalocal systemd[1]: Starting Cloud-init: Config Stage...
Nov 28 16:20:35 np0005539040.novalocal systemd[1]: Starting Crash recovery kernel arming...
Nov 28 16:20:35 np0005539040.novalocal systemd[1]: Starting Notify NFS peers of a restart...
Nov 28 16:20:35 np0005539040.novalocal systemd[1]: Starting System Logging Service...
Nov 28 16:20:35 np0005539040.novalocal sm-notify[1004]: Version 2.5.4 starting
Nov 28 16:20:35 np0005539040.novalocal systemd[1]: Starting OpenSSH server daemon...
Nov 28 16:20:35 np0005539040.novalocal systemd[1]: Starting Permit User Sessions...
Nov 28 16:20:35 np0005539040.novalocal systemd[1]: Started Notify NFS peers of a restart.
Nov 28 16:20:35 np0005539040.novalocal sshd[1006]: Server listening on 0.0.0.0 port 22.
Nov 28 16:20:35 np0005539040.novalocal sshd[1006]: Server listening on :: port 22.
Nov 28 16:20:35 np0005539040.novalocal systemd[1]: Started OpenSSH server daemon.
Nov 28 16:20:35 np0005539040.novalocal systemd[1]: Finished Permit User Sessions.
Nov 28 16:20:35 np0005539040.novalocal systemd[1]: Started Command Scheduler.
Nov 28 16:20:35 np0005539040.novalocal systemd[1]: Started Getty on tty1.
Nov 28 16:20:35 np0005539040.novalocal systemd[1]: Started Serial Getty on ttyS0.
Nov 28 16:20:35 np0005539040.novalocal crond[1009]: (CRON) STARTUP (1.5.7)
Nov 28 16:20:35 np0005539040.novalocal crond[1009]: (CRON) INFO (Syslog will be used instead of sendmail.)
Nov 28 16:20:35 np0005539040.novalocal crond[1009]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 16% if used.)
Nov 28 16:20:35 np0005539040.novalocal crond[1009]: (CRON) INFO (running with inotify support)
Nov 28 16:20:35 np0005539040.novalocal systemd[1]: Reached target Login Prompts.
Nov 28 16:20:35 np0005539040.novalocal rsyslogd[1005]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1005" x-info="https://www.rsyslog.com"] start
Nov 28 16:20:35 np0005539040.novalocal rsyslogd[1005]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Nov 28 16:20:35 np0005539040.novalocal systemd[1]: Started System Logging Service.
Nov 28 16:20:35 np0005539040.novalocal systemd[1]: Reached target Multi-User System.
Nov 28 16:20:35 np0005539040.novalocal systemd[1]: Starting Record Runlevel Change in UTMP...
Nov 28 16:20:35 np0005539040.novalocal systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Nov 28 16:20:35 np0005539040.novalocal systemd[1]: Finished Record Runlevel Change in UTMP.
Nov 28 16:20:35 np0005539040.novalocal rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 16:20:36 np0005539040.novalocal sshd-session[1048]: Connection reset by 38.102.83.114 port 59884 [preauth]
Nov 28 16:20:36 np0005539040.novalocal sshd-session[1064]: Unable to negotiate with 38.102.83.114 port 59900: no matching host key type found. Their offer: ssh-ed25519,ssh-ed25519-cert-v01@openssh.com [preauth]
Nov 28 16:20:36 np0005539040.novalocal sshd-session[1066]: Connection reset by 38.102.83.114 port 59916 [preauth]
Nov 28 16:20:36 np0005539040.novalocal kdumpctl[1018]: kdump: No kdump initial ramdisk found.
Nov 28 16:20:36 np0005539040.novalocal kdumpctl[1018]: kdump: Rebuilding /boot/initramfs-5.14.0-642.el9.x86_64kdump.img
Nov 28 16:20:36 np0005539040.novalocal sshd-session[1070]: Unable to negotiate with 38.102.83.114 port 59930: no matching host key type found. Their offer: ecdsa-sha2-nistp384,ecdsa-sha2-nistp384-cert-v01@openssh.com [preauth]
Nov 28 16:20:36 np0005539040.novalocal sshd-session[1077]: Unable to negotiate with 38.102.83.114 port 59944: no matching host key type found. Their offer: ecdsa-sha2-nistp521,ecdsa-sha2-nistp521-cert-v01@openssh.com [preauth]
Nov 28 16:20:36 np0005539040.novalocal sshd-session[1084]: Connection closed by 38.102.83.114 port 59958 [preauth]
Nov 28 16:20:36 np0005539040.novalocal sshd-session[1144]: Unable to negotiate with 38.102.83.114 port 59982: no matching host key type found. Their offer: ssh-rsa,ssh-rsa-cert-v01@openssh.com [preauth]
Nov 28 16:20:36 np0005539040.novalocal sshd-session[1151]: Unable to negotiate with 38.102.83.114 port 59994: no matching host key type found. Their offer: ssh-dss,ssh-dss-cert-v01@openssh.com [preauth]
Nov 28 16:20:36 np0005539040.novalocal sshd-session[1103]: Connection closed by 38.102.83.114 port 59968 [preauth]
Nov 28 16:20:36 np0005539040.novalocal cloud-init[1263]: Cloud-init v. 24.4-7.el9 running 'modules:config' at Fri, 28 Nov 2025 16:20:36 +0000. Up 12.59 seconds.
Nov 28 16:20:36 np0005539040.novalocal dracut[1282]: dracut-057-102.git20250818.el9
Nov 28 16:20:36 np0005539040.novalocal systemd[1]: Finished Cloud-init: Config Stage.
Nov 28 16:20:36 np0005539040.novalocal systemd[1]: Starting Cloud-init: Final Stage...
Nov 28 16:20:36 np0005539040.novalocal dracut[1284]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/b277050f-8ace-464d-abb6-4c46d4c45253 /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-642.el9.x86_64kdump.img 5.14.0-642.el9.x86_64
Nov 28 16:20:37 np0005539040.novalocal cloud-init[1371]: Cloud-init v. 24.4-7.el9 running 'modules:final' at Fri, 28 Nov 2025 16:20:37 +0000. Up 13.17 seconds.
Nov 28 16:20:37 np0005539040.novalocal cloud-init[1385]: #############################################################
Nov 28 16:20:37 np0005539040.novalocal cloud-init[1387]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Nov 28 16:20:37 np0005539040.novalocal cloud-init[1391]: 256 SHA256:DFdfT6E7ulfSDaqAF7/+BQWg4+jN/nwIEieTFXe+sv8 root@np0005539040.novalocal (ECDSA)
Nov 28 16:20:37 np0005539040.novalocal cloud-init[1396]: 256 SHA256:HntNn97g1PYoGDYo0F24Ohl+GG94IkJs7jg8IumR/1M root@np0005539040.novalocal (ED25519)
Nov 28 16:20:37 np0005539040.novalocal cloud-init[1398]: 3072 SHA256:farp3E8jQRJhDG0UTnBH/4XcH2zDvYtDge7Wui8NH34 root@np0005539040.novalocal (RSA)
Nov 28 16:20:37 np0005539040.novalocal cloud-init[1399]: -----END SSH HOST KEY FINGERPRINTS-----
Nov 28 16:20:37 np0005539040.novalocal cloud-init[1402]: #############################################################
Nov 28 16:20:37 np0005539040.novalocal cloud-init[1371]: Cloud-init v. 24.4-7.el9 finished at Fri, 28 Nov 2025 16:20:37 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 13.36 seconds
Nov 28 16:20:37 np0005539040.novalocal systemd[1]: Finished Cloud-init: Final Stage.
Nov 28 16:20:37 np0005539040.novalocal systemd[1]: Reached target Cloud-init target.
Nov 28 16:20:37 np0005539040.novalocal dracut[1284]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Nov 28 16:20:37 np0005539040.novalocal dracut[1284]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Nov 28 16:20:37 np0005539040.novalocal dracut[1284]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Nov 28 16:20:37 np0005539040.novalocal dracut[1284]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Nov 28 16:20:37 np0005539040.novalocal dracut[1284]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Nov 28 16:20:37 np0005539040.novalocal dracut[1284]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Nov 28 16:20:37 np0005539040.novalocal dracut[1284]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Nov 28 16:20:37 np0005539040.novalocal dracut[1284]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Nov 28 16:20:37 np0005539040.novalocal dracut[1284]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Nov 28 16:20:37 np0005539040.novalocal dracut[1284]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Nov 28 16:20:37 np0005539040.novalocal dracut[1284]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Nov 28 16:20:37 np0005539040.novalocal dracut[1284]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Nov 28 16:20:37 np0005539040.novalocal dracut[1284]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Nov 28 16:20:37 np0005539040.novalocal dracut[1284]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Nov 28 16:20:37 np0005539040.novalocal dracut[1284]: Module 'ifcfg' will not be installed, because it's in the list to be omitted!
Nov 28 16:20:37 np0005539040.novalocal dracut[1284]: Module 'plymouth' will not be installed, because it's in the list to be omitted!
Nov 28 16:20:37 np0005539040.novalocal dracut[1284]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Nov 28 16:20:37 np0005539040.novalocal dracut[1284]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Nov 28 16:20:37 np0005539040.novalocal dracut[1284]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Nov 28 16:20:37 np0005539040.novalocal dracut[1284]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Nov 28 16:20:37 np0005539040.novalocal dracut[1284]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Nov 28 16:20:37 np0005539040.novalocal dracut[1284]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Nov 28 16:20:37 np0005539040.novalocal dracut[1284]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Nov 28 16:20:37 np0005539040.novalocal dracut[1284]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Nov 28 16:20:37 np0005539040.novalocal dracut[1284]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Nov 28 16:20:37 np0005539040.novalocal dracut[1284]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Nov 28 16:20:37 np0005539040.novalocal dracut[1284]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Nov 28 16:20:37 np0005539040.novalocal dracut[1284]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Nov 28 16:20:37 np0005539040.novalocal dracut[1284]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Nov 28 16:20:37 np0005539040.novalocal dracut[1284]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Nov 28 16:20:37 np0005539040.novalocal dracut[1284]: Module 'resume' will not be installed, because it's in the list to be omitted!
Nov 28 16:20:37 np0005539040.novalocal dracut[1284]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Nov 28 16:20:37 np0005539040.novalocal dracut[1284]: Module 'earlykdump' will not be installed, because it's in the list to be omitted!
Nov 28 16:20:37 np0005539040.novalocal dracut[1284]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Nov 28 16:20:37 np0005539040.novalocal dracut[1284]: memstrack is not available
Nov 28 16:20:37 np0005539040.novalocal dracut[1284]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Nov 28 16:20:37 np0005539040.novalocal dracut[1284]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Nov 28 16:20:38 np0005539040.novalocal dracut[1284]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Nov 28 16:20:38 np0005539040.novalocal dracut[1284]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Nov 28 16:20:38 np0005539040.novalocal dracut[1284]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Nov 28 16:20:38 np0005539040.novalocal dracut[1284]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Nov 28 16:20:38 np0005539040.novalocal dracut[1284]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Nov 28 16:20:38 np0005539040.novalocal dracut[1284]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Nov 28 16:20:38 np0005539040.novalocal dracut[1284]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Nov 28 16:20:38 np0005539040.novalocal dracut[1284]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Nov 28 16:20:38 np0005539040.novalocal dracut[1284]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Nov 28 16:20:38 np0005539040.novalocal dracut[1284]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Nov 28 16:20:38 np0005539040.novalocal dracut[1284]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Nov 28 16:20:38 np0005539040.novalocal dracut[1284]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Nov 28 16:20:38 np0005539040.novalocal dracut[1284]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Nov 28 16:20:38 np0005539040.novalocal dracut[1284]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Nov 28 16:20:38 np0005539040.novalocal dracut[1284]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Nov 28 16:20:38 np0005539040.novalocal dracut[1284]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Nov 28 16:20:38 np0005539040.novalocal dracut[1284]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Nov 28 16:20:38 np0005539040.novalocal dracut[1284]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Nov 28 16:20:38 np0005539040.novalocal dracut[1284]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Nov 28 16:20:38 np0005539040.novalocal dracut[1284]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Nov 28 16:20:38 np0005539040.novalocal dracut[1284]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Nov 28 16:20:38 np0005539040.novalocal dracut[1284]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Nov 28 16:20:38 np0005539040.novalocal dracut[1284]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Nov 28 16:20:38 np0005539040.novalocal dracut[1284]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Nov 28 16:20:38 np0005539040.novalocal dracut[1284]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Nov 28 16:20:38 np0005539040.novalocal dracut[1284]: memstrack is not available
Nov 28 16:20:38 np0005539040.novalocal dracut[1284]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Nov 28 16:20:38 np0005539040.novalocal dracut[1284]: *** Including module: systemd ***
Nov 28 16:20:38 np0005539040.novalocal dracut[1284]: *** Including module: fips ***
Nov 28 16:20:38 np0005539040.novalocal dracut[1284]: *** Including module: systemd-initrd ***
Nov 28 16:20:39 np0005539040.novalocal dracut[1284]: *** Including module: i18n ***
Nov 28 16:20:39 np0005539040.novalocal dracut[1284]: *** Including module: drm ***
Nov 28 16:20:39 np0005539040.novalocal chronyd[792]: Selected source 206.108.0.131 (2.centos.pool.ntp.org)
Nov 28 16:20:39 np0005539040.novalocal chronyd[792]: System clock wrong by 1.317182 seconds
Nov 28 16:20:40 np0005539040.novalocal chronyd[792]: System clock was stepped by 1.317182 seconds
Nov 28 16:20:40 np0005539040.novalocal chronyd[792]: System clock TAI offset set to 37 seconds
Nov 28 16:20:40 np0005539040.novalocal dracut[1284]: *** Including module: prefixdevname ***
Nov 28 16:20:40 np0005539040.novalocal dracut[1284]: *** Including module: kernel-modules ***
Nov 28 16:20:40 np0005539040.novalocal kernel: block vda: the capability attribute has been deprecated.
Nov 28 16:20:41 np0005539040.novalocal dracut[1284]: *** Including module: kernel-modules-extra ***
Nov 28 16:20:41 np0005539040.novalocal dracut[1284]:   kernel-modules-extra: configuration source "/run/depmod.d" does not exist
Nov 28 16:20:41 np0005539040.novalocal dracut[1284]:   kernel-modules-extra: configuration source "/lib/depmod.d" does not exist
Nov 28 16:20:41 np0005539040.novalocal dracut[1284]:   kernel-modules-extra: parsing configuration file "/etc/depmod.d/dist.conf"
Nov 28 16:20:41 np0005539040.novalocal dracut[1284]:   kernel-modules-extra: /etc/depmod.d/dist.conf: added "updates extra built-in weak-updates" to the list of search directories
Nov 28 16:20:41 np0005539040.novalocal dracut[1284]: *** Including module: qemu ***
Nov 28 16:20:41 np0005539040.novalocal dracut[1284]: *** Including module: fstab-sys ***
Nov 28 16:20:41 np0005539040.novalocal dracut[1284]: *** Including module: rootfs-block ***
Nov 28 16:20:41 np0005539040.novalocal dracut[1284]: *** Including module: terminfo ***
Nov 28 16:20:41 np0005539040.novalocal dracut[1284]: *** Including module: udev-rules ***
Nov 28 16:20:41 np0005539040.novalocal dracut[1284]: Skipping udev rule: 91-permissions.rules
Nov 28 16:20:41 np0005539040.novalocal dracut[1284]: Skipping udev rule: 80-drivers-modprobe.rules
Nov 28 16:20:42 np0005539040.novalocal dracut[1284]: *** Including module: virtiofs ***
Nov 28 16:20:42 np0005539040.novalocal dracut[1284]: *** Including module: dracut-systemd ***
Nov 28 16:20:42 np0005539040.novalocal dracut[1284]: *** Including module: usrmount ***
Nov 28 16:20:42 np0005539040.novalocal dracut[1284]: *** Including module: base ***
Nov 28 16:20:42 np0005539040.novalocal dracut[1284]: *** Including module: fs-lib ***
Nov 28 16:20:42 np0005539040.novalocal dracut[1284]: *** Including module: kdumpbase ***
Nov 28 16:20:42 np0005539040.novalocal dracut[1284]: *** Including module: microcode_ctl-fw_dir_override ***
Nov 28 16:20:42 np0005539040.novalocal dracut[1284]:   microcode_ctl module: mangling fw_dir
Nov 28 16:20:42 np0005539040.novalocal dracut[1284]:     microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Nov 28 16:20:42 np0005539040.novalocal dracut[1284]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Nov 28 16:20:42 np0005539040.novalocal dracut[1284]:     microcode_ctl: configuration "intel" is ignored
Nov 28 16:20:42 np0005539040.novalocal dracut[1284]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Nov 28 16:20:42 np0005539040.novalocal dracut[1284]:     microcode_ctl: configuration "intel-06-2d-07" is ignored
Nov 28 16:20:42 np0005539040.novalocal dracut[1284]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Nov 28 16:20:42 np0005539040.novalocal dracut[1284]:     microcode_ctl: configuration "intel-06-4e-03" is ignored
Nov 28 16:20:42 np0005539040.novalocal dracut[1284]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Nov 28 16:20:42 np0005539040.novalocal dracut[1284]:     microcode_ctl: configuration "intel-06-4f-01" is ignored
Nov 28 16:20:42 np0005539040.novalocal dracut[1284]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Nov 28 16:20:43 np0005539040.novalocal dracut[1284]:     microcode_ctl: configuration "intel-06-55-04" is ignored
Nov 28 16:20:43 np0005539040.novalocal dracut[1284]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Nov 28 16:20:43 np0005539040.novalocal dracut[1284]:     microcode_ctl: configuration "intel-06-5e-03" is ignored
Nov 28 16:20:43 np0005539040.novalocal dracut[1284]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Nov 28 16:20:43 np0005539040.novalocal dracut[1284]:     microcode_ctl: configuration "intel-06-8c-01" is ignored
Nov 28 16:20:43 np0005539040.novalocal dracut[1284]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Nov 28 16:20:43 np0005539040.novalocal dracut[1284]:     microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Nov 28 16:20:43 np0005539040.novalocal dracut[1284]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Nov 28 16:20:43 np0005539040.novalocal dracut[1284]:     microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Nov 28 16:20:43 np0005539040.novalocal dracut[1284]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Nov 28 16:20:43 np0005539040.novalocal dracut[1284]:     microcode_ctl: configuration "intel-06-8f-08" is ignored
Nov 28 16:20:43 np0005539040.novalocal dracut[1284]:     microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Nov 28 16:20:43 np0005539040.novalocal dracut[1284]: *** Including module: openssl ***
Nov 28 16:20:43 np0005539040.novalocal dracut[1284]: *** Including module: shutdown ***
Nov 28 16:20:43 np0005539040.novalocal dracut[1284]: *** Including module: squash ***
Nov 28 16:20:43 np0005539040.novalocal dracut[1284]: *** Including modules done ***
Nov 28 16:20:43 np0005539040.novalocal dracut[1284]: *** Installing kernel module dependencies ***
Nov 28 16:20:44 np0005539040.novalocal irqbalance[780]: Cannot change IRQ 25 affinity: Operation not permitted
Nov 28 16:20:44 np0005539040.novalocal irqbalance[780]: IRQ 25 affinity is now unmanaged
Nov 28 16:20:44 np0005539040.novalocal irqbalance[780]: Cannot change IRQ 31 affinity: Operation not permitted
Nov 28 16:20:44 np0005539040.novalocal irqbalance[780]: IRQ 31 affinity is now unmanaged
Nov 28 16:20:44 np0005539040.novalocal irqbalance[780]: Cannot change IRQ 28 affinity: Operation not permitted
Nov 28 16:20:44 np0005539040.novalocal irqbalance[780]: IRQ 28 affinity is now unmanaged
Nov 28 16:20:44 np0005539040.novalocal irqbalance[780]: Cannot change IRQ 32 affinity: Operation not permitted
Nov 28 16:20:44 np0005539040.novalocal irqbalance[780]: IRQ 32 affinity is now unmanaged
Nov 28 16:20:44 np0005539040.novalocal irqbalance[780]: Cannot change IRQ 30 affinity: Operation not permitted
Nov 28 16:20:44 np0005539040.novalocal irqbalance[780]: IRQ 30 affinity is now unmanaged
Nov 28 16:20:44 np0005539040.novalocal irqbalance[780]: Cannot change IRQ 29 affinity: Operation not permitted
Nov 28 16:20:44 np0005539040.novalocal irqbalance[780]: IRQ 29 affinity is now unmanaged
Nov 28 16:20:44 np0005539040.novalocal dracut[1284]: *** Installing kernel module dependencies done ***
Nov 28 16:20:44 np0005539040.novalocal dracut[1284]: *** Resolving executable dependencies ***
Nov 28 16:20:45 np0005539040.novalocal dracut[1284]: *** Resolving executable dependencies done ***
Nov 28 16:20:45 np0005539040.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 28 16:20:45 np0005539040.novalocal dracut[1284]: *** Generating early-microcode cpio image ***
Nov 28 16:20:45 np0005539040.novalocal dracut[1284]: *** Store current command line parameters ***
Nov 28 16:20:45 np0005539040.novalocal dracut[1284]: Stored kernel commandline:
Nov 28 16:20:45 np0005539040.novalocal dracut[1284]: No dracut internal kernel commandline stored in the initramfs
Nov 28 16:20:45 np0005539040.novalocal dracut[1284]: *** Install squash loader ***
Nov 28 16:20:46 np0005539040.novalocal dracut[1284]: *** Squashing the files inside the initramfs ***
Nov 28 16:20:48 np0005539040.novalocal dracut[1284]: *** Squashing the files inside the initramfs done ***
Nov 28 16:20:48 np0005539040.novalocal dracut[1284]: *** Creating image file '/boot/initramfs-5.14.0-642.el9.x86_64kdump.img' ***
Nov 28 16:20:48 np0005539040.novalocal dracut[1284]: *** Hardlinking files ***
Nov 28 16:20:48 np0005539040.novalocal dracut[1284]: Mode:           real
Nov 28 16:20:48 np0005539040.novalocal dracut[1284]: Files:          50
Nov 28 16:20:48 np0005539040.novalocal dracut[1284]: Linked:         0 files
Nov 28 16:20:48 np0005539040.novalocal dracut[1284]: Compared:       0 xattrs
Nov 28 16:20:48 np0005539040.novalocal dracut[1284]: Compared:       0 files
Nov 28 16:20:48 np0005539040.novalocal dracut[1284]: Saved:          0 B
Nov 28 16:20:48 np0005539040.novalocal dracut[1284]: Duration:       0.000475 seconds
Nov 28 16:20:48 np0005539040.novalocal dracut[1284]: *** Hardlinking files done ***
Nov 28 16:20:48 np0005539040.novalocal dracut[1284]: *** Creating initramfs image file '/boot/initramfs-5.14.0-642.el9.x86_64kdump.img' done ***
Nov 28 16:20:50 np0005539040.novalocal kdumpctl[1018]: kdump: kexec: loaded kdump kernel
Nov 28 16:20:50 np0005539040.novalocal kdumpctl[1018]: kdump: Starting kdump: [OK]
Nov 28 16:20:50 np0005539040.novalocal systemd[1]: Finished Crash recovery kernel arming.
Nov 28 16:20:50 np0005539040.novalocal systemd[1]: Startup finished in 3.219s (kernel) + 3.105s (initrd) + 18.614s (userspace) = 24.940s.
Nov 28 16:21:05 np0005539040.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 28 16:26:14 np0005539040.novalocal sshd-session[4297]: Connection closed by 188.166.104.67 port 37534
Nov 28 16:27:29 np0005539040.novalocal sshd-session[4299]: Connection closed by authenticating user root 188.166.104.67 port 46160 [preauth]
Nov 28 16:28:05 np0005539040.novalocal sshd-session[4301]: Accepted publickey for zuul from 38.102.83.114 port 56120 ssh2: RSA SHA256:zhs3MiW0JhxzckYcMHQES8SMYHj1iGcomnyzmbiwor8
Nov 28 16:28:05 np0005539040.novalocal systemd[1]: Created slice User Slice of UID 1000.
Nov 28 16:28:05 np0005539040.novalocal systemd[1]: Starting User Runtime Directory /run/user/1000...
Nov 28 16:28:05 np0005539040.novalocal systemd-logind[788]: New session 1 of user zuul.
Nov 28 16:28:05 np0005539040.novalocal systemd[1]: Finished User Runtime Directory /run/user/1000.
Nov 28 16:28:05 np0005539040.novalocal systemd[1]: Starting User Manager for UID 1000...
Nov 28 16:28:05 np0005539040.novalocal systemd[4305]: pam_unix(systemd-user:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 28 16:28:05 np0005539040.novalocal systemd[4305]: Queued start job for default target Main User Target.
Nov 28 16:28:05 np0005539040.novalocal systemd[4305]: Created slice User Application Slice.
Nov 28 16:28:05 np0005539040.novalocal systemd[4305]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 28 16:28:05 np0005539040.novalocal systemd[4305]: Started Daily Cleanup of User's Temporary Directories.
Nov 28 16:28:05 np0005539040.novalocal systemd[4305]: Reached target Paths.
Nov 28 16:28:05 np0005539040.novalocal systemd[4305]: Reached target Timers.
Nov 28 16:28:05 np0005539040.novalocal systemd[4305]: Starting D-Bus User Message Bus Socket...
Nov 28 16:28:05 np0005539040.novalocal systemd[4305]: Starting Create User's Volatile Files and Directories...
Nov 28 16:28:05 np0005539040.novalocal systemd[4305]: Listening on D-Bus User Message Bus Socket.
Nov 28 16:28:05 np0005539040.novalocal systemd[4305]: Reached target Sockets.
Nov 28 16:28:05 np0005539040.novalocal systemd[4305]: Finished Create User's Volatile Files and Directories.
Nov 28 16:28:05 np0005539040.novalocal systemd[4305]: Reached target Basic System.
Nov 28 16:28:05 np0005539040.novalocal systemd[4305]: Reached target Main User Target.
Nov 28 16:28:05 np0005539040.novalocal systemd[4305]: Startup finished in 119ms.
Nov 28 16:28:05 np0005539040.novalocal systemd[1]: Started User Manager for UID 1000.
Nov 28 16:28:05 np0005539040.novalocal systemd[1]: Started Session 1 of User zuul.
Nov 28 16:28:05 np0005539040.novalocal sshd-session[4301]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 28 16:28:06 np0005539040.novalocal python3[4389]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 16:28:09 np0005539040.novalocal python3[4417]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 16:28:11 np0005539040.novalocal sshd-session[4452]: Connection closed by authenticating user root 188.166.104.67 port 40590 [preauth]
Nov 28 16:28:17 np0005539040.novalocal python3[4477]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 16:28:18 np0005539040.novalocal python3[4517]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Nov 28 16:28:20 np0005539040.novalocal python3[4543]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCj0Wi8HpeH+V+mJFwLDGsHZ14Xb19j1UaiWEp+sgQzyJe6S7+DDXgqgngnQi0jIrAvvk2F/46TKWP0quSXSvBfvB3x2B6Z4vnhHqn/Vb5GnGiTZAwSj76yZIAMNdBHy+oO9BkDQgUXJnyCtxkMzLEh/e6Q3ay8hQxcKhueOZGiPQ7Rq7yP4UynTbHr9T5rjQl0q9jhPZV9SUjNHzpBo+JzncQWxGT4450aKZ5bHAYY4gAcerHpEsdofr39eDjvo5htfxY6SzVymvAK/RYrPxu2qfbAO+lqc4ouRDBXQeOl2Z0SZltNbC+sVnGXb7JeEvo0lndRRC2LKPtLQOYr1M6e63wFfJvs05aX2w71HZ9DTf/m1YFSqY1umUfy6urwFrpmu8mfHM0bYaYoDgeVRPPNvzjh7ziWUm+eEB3TtVtm0zbhril3BogAAAkiOkggSB22l04nPiJrbTMv0XvGkcnDnvfH6zWzrrLJtBlnkLkiepIdfPaIGYhYA+U4jWlI/C0= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 16:28:20 np0005539040.novalocal python3[4567]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 16:28:21 np0005539040.novalocal python3[4666]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 16:28:21 np0005539040.novalocal python3[4737]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764347301.1150684-229-200139857676863/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=75e802ee030a404cbb184bc0e3e173a6_id_rsa follow=False checksum=6ba6c2f51aac988d4f3c17baf3931b842f2413d8 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 16:28:22 np0005539040.novalocal python3[4860]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 16:28:22 np0005539040.novalocal python3[4931]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764347302.0713766-273-275308608432515/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=75e802ee030a404cbb184bc0e3e173a6_id_rsa.pub follow=False checksum=f80d920f429ce364f82b12ec5dde3fd91099cc68 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 16:28:23 np0005539040.novalocal python3[4979]: ansible-ping Invoked with data=pong
Nov 28 16:28:24 np0005539040.novalocal python3[5003]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 16:28:26 np0005539040.novalocal python3[5061]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Nov 28 16:28:27 np0005539040.novalocal python3[5093]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 16:28:28 np0005539040.novalocal python3[5117]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 16:28:28 np0005539040.novalocal python3[5141]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 16:28:28 np0005539040.novalocal python3[5165]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 16:28:28 np0005539040.novalocal python3[5189]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 16:28:29 np0005539040.novalocal python3[5213]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 16:28:30 np0005539040.novalocal sudo[5237]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owtcodnprutjabhpkkvedojskohvsiyf ; /usr/bin/python3'
Nov 28 16:28:30 np0005539040.novalocal sudo[5237]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 16:28:30 np0005539040.novalocal python3[5239]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 16:28:30 np0005539040.novalocal sudo[5237]: pam_unix(sudo:session): session closed for user root
Nov 28 16:28:31 np0005539040.novalocal sudo[5315]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uqxlwgsodbajazgmoaanltcygmnamrje ; /usr/bin/python3'
Nov 28 16:28:31 np0005539040.novalocal sudo[5315]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 16:28:31 np0005539040.novalocal python3[5317]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 16:28:31 np0005539040.novalocal sudo[5315]: pam_unix(sudo:session): session closed for user root
Nov 28 16:28:31 np0005539040.novalocal sudo[5388]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hubelvseszmxolwkczqbcatdddsbemzn ; /usr/bin/python3'
Nov 28 16:28:31 np0005539040.novalocal sudo[5388]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 16:28:31 np0005539040.novalocal python3[5390]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764347311.022142-26-179448882811478/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 16:28:31 np0005539040.novalocal sudo[5388]: pam_unix(sudo:session): session closed for user root
Nov 28 16:28:32 np0005539040.novalocal python3[5438]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 16:28:32 np0005539040.novalocal python3[5462]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 16:28:33 np0005539040.novalocal python3[5486]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 16:28:33 np0005539040.novalocal python3[5510]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 16:28:33 np0005539040.novalocal python3[5534]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 16:28:33 np0005539040.novalocal python3[5558]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 16:28:34 np0005539040.novalocal python3[5582]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 16:28:34 np0005539040.novalocal python3[5606]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 16:28:34 np0005539040.novalocal python3[5630]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 16:28:34 np0005539040.novalocal python3[5654]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 16:28:35 np0005539040.novalocal python3[5678]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 16:28:35 np0005539040.novalocal python3[5702]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 16:28:35 np0005539040.novalocal python3[5726]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 16:28:36 np0005539040.novalocal python3[5750]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 16:28:36 np0005539040.novalocal python3[5774]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 16:28:36 np0005539040.novalocal python3[5798]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 16:28:36 np0005539040.novalocal python3[5822]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 16:28:37 np0005539040.novalocal python3[5846]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 16:28:37 np0005539040.novalocal python3[5870]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 16:28:37 np0005539040.novalocal python3[5894]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 16:28:37 np0005539040.novalocal python3[5918]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 16:28:38 np0005539040.novalocal python3[5942]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 16:28:38 np0005539040.novalocal python3[5966]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 16:28:38 np0005539040.novalocal python3[5990]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 16:28:39 np0005539040.novalocal python3[6014]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 16:28:39 np0005539040.novalocal python3[6038]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 16:28:42 np0005539040.novalocal sudo[6062]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwpqqjpibluoobvaclzpwleailkqvaaw ; /usr/bin/python3'
Nov 28 16:28:42 np0005539040.novalocal sudo[6062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 16:28:42 np0005539040.novalocal python3[6064]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Nov 28 16:28:42 np0005539040.novalocal systemd[1]: Starting Time & Date Service...
Nov 28 16:28:42 np0005539040.novalocal systemd[1]: Started Time & Date Service.
Nov 28 16:28:42 np0005539040.novalocal systemd-timedated[6066]: Changed time zone to 'UTC' (UTC).
Nov 28 16:28:42 np0005539040.novalocal sudo[6062]: pam_unix(sudo:session): session closed for user root
Nov 28 16:28:43 np0005539040.novalocal sudo[6094]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-irafqgjejzomfcqpdvnpjuisauteyfoh ; /usr/bin/python3'
Nov 28 16:28:43 np0005539040.novalocal sudo[6094]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 16:28:43 np0005539040.novalocal python3[6096]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 16:28:43 np0005539040.novalocal sudo[6094]: pam_unix(sudo:session): session closed for user root
Nov 28 16:28:44 np0005539040.novalocal python3[6172]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 16:28:44 np0005539040.novalocal python3[6243]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1764347324.0997455-202-6593334878775/source _original_basename=tmpvm5u_n_d follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 16:28:45 np0005539040.novalocal python3[6343]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 16:28:45 np0005539040.novalocal python3[6414]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1764347324.9155579-242-133461823236389/source _original_basename=tmp0h21fvno follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 16:28:46 np0005539040.novalocal sudo[6514]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygsbuextzfcejjpvtzvcmukhoqkyluos ; /usr/bin/python3'
Nov 28 16:28:46 np0005539040.novalocal sudo[6514]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 16:28:46 np0005539040.novalocal python3[6516]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 16:28:46 np0005539040.novalocal sudo[6514]: pam_unix(sudo:session): session closed for user root
Nov 28 16:28:46 np0005539040.novalocal sudo[6587]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lssioovomlwtotjdfklwtgrosenpxvky ; /usr/bin/python3'
Nov 28 16:28:46 np0005539040.novalocal sudo[6587]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 16:28:46 np0005539040.novalocal python3[6589]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1764347326.1833427-306-213745706744115/source _original_basename=tmpdem3gixk follow=False checksum=7231cdd791bcee3db9aef032c5c5e69dc3ceaf61 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 16:28:46 np0005539040.novalocal sudo[6587]: pam_unix(sudo:session): session closed for user root
Nov 28 16:28:47 np0005539040.novalocal python3[6637]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 16:28:47 np0005539040.novalocal python3[6663]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 16:28:47 np0005539040.novalocal sudo[6741]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pffgkjfftfsdcjhcotielrztdqkxnozg ; /usr/bin/python3'
Nov 28 16:28:47 np0005539040.novalocal sudo[6741]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 16:28:47 np0005539040.novalocal python3[6743]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 16:28:48 np0005539040.novalocal sudo[6741]: pam_unix(sudo:session): session closed for user root
Nov 28 16:28:48 np0005539040.novalocal sudo[6814]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-acqhkqazrmcaabyblqdnokqbhhsfvjjf ; /usr/bin/python3'
Nov 28 16:28:48 np0005539040.novalocal sudo[6814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 16:28:48 np0005539040.novalocal python3[6816]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1764347327.749946-362-164928322726876/source _original_basename=tmptr0q5jca follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 16:28:48 np0005539040.novalocal sudo[6814]: pam_unix(sudo:session): session closed for user root
Nov 28 16:28:48 np0005539040.novalocal sudo[6865]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kradvxmmsmwdqciocmklpqblpjxlmflf ; /usr/bin/python3'
Nov 28 16:28:48 np0005539040.novalocal sudo[6865]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 16:28:48 np0005539040.novalocal python3[6867]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163ec2-ffbe-e502-97fd-00000000001e-1-compute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 16:28:48 np0005539040.novalocal sudo[6865]: pam_unix(sudo:session): session closed for user root
Nov 28 16:28:49 np0005539040.novalocal python3[6895]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env
                                                       _uses_shell=True zuul_log_id=fa163ec2-ffbe-e502-97fd-00000000001f-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Nov 28 16:28:50 np0005539040.novalocal python3[6923]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 16:28:54 np0005539040.novalocal sshd-session[6924]: Connection closed by authenticating user root 188.166.104.67 port 59922 [preauth]
Nov 28 16:29:12 np0005539040.novalocal systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 28 16:29:15 np0005539040.novalocal sudo[6951]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drgqlxiwmflwbacjqoqsalqxfqzgisxj ; /usr/bin/python3'
Nov 28 16:29:15 np0005539040.novalocal sudo[6951]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 16:29:15 np0005539040.novalocal python3[6953]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 16:29:15 np0005539040.novalocal sudo[6951]: pam_unix(sudo:session): session closed for user root
Nov 28 16:29:35 np0005539040.novalocal sshd-session[6954]: Connection closed by authenticating user root 188.166.104.67 port 54728 [preauth]
Nov 28 16:29:53 np0005539040.novalocal kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Nov 28 16:29:53 np0005539040.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Nov 28 16:29:53 np0005539040.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Nov 28 16:29:53 np0005539040.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Nov 28 16:29:53 np0005539040.novalocal kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Nov 28 16:29:53 np0005539040.novalocal kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Nov 28 16:29:53 np0005539040.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Nov 28 16:29:53 np0005539040.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Nov 28 16:29:53 np0005539040.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Nov 28 16:29:53 np0005539040.novalocal kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Nov 28 16:29:53 np0005539040.novalocal NetworkManager[858]: <info>  [1764347393.1287] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 28 16:29:53 np0005539040.novalocal systemd-udevd[6956]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 16:29:53 np0005539040.novalocal NetworkManager[858]: <info>  [1764347393.1436] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 28 16:29:53 np0005539040.novalocal NetworkManager[858]: <info>  [1764347393.1467] settings: (eth1): created default wired connection 'Wired connection 1'
Nov 28 16:29:53 np0005539040.novalocal NetworkManager[858]: <info>  [1764347393.1471] device (eth1): carrier: link connected
Nov 28 16:29:53 np0005539040.novalocal NetworkManager[858]: <info>  [1764347393.1474] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Nov 28 16:29:53 np0005539040.novalocal NetworkManager[858]: <info>  [1764347393.1479] policy: auto-activating connection 'Wired connection 1' (3805b732-7d96-3b33-9481-4cdfa6af1193)
Nov 28 16:29:53 np0005539040.novalocal NetworkManager[858]: <info>  [1764347393.1484] device (eth1): Activation: starting connection 'Wired connection 1' (3805b732-7d96-3b33-9481-4cdfa6af1193)
Nov 28 16:29:53 np0005539040.novalocal NetworkManager[858]: <info>  [1764347393.1485] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 28 16:29:53 np0005539040.novalocal NetworkManager[858]: <info>  [1764347393.1487] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 28 16:29:53 np0005539040.novalocal NetworkManager[858]: <info>  [1764347393.1492] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 28 16:29:53 np0005539040.novalocal NetworkManager[858]: <info>  [1764347393.1496] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 28 16:29:53 np0005539040.novalocal python3[6983]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163ec2-ffbe-5720-4d5a-000000000112-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 16:30:00 np0005539040.novalocal sudo[7061]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dvwnaumpwhbfgeylmysmuzpxjszcoeoy ; OS_CLOUD=vexxhost /usr/bin/python3'
Nov 28 16:30:00 np0005539040.novalocal sudo[7061]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 16:30:01 np0005539040.novalocal python3[7063]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 16:30:01 np0005539040.novalocal sudo[7061]: pam_unix(sudo:session): session closed for user root
Nov 28 16:30:01 np0005539040.novalocal sudo[7134]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xkqtorepmsrfqnfedrhkacjgrbgohrpj ; OS_CLOUD=vexxhost /usr/bin/python3'
Nov 28 16:30:01 np0005539040.novalocal sudo[7134]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 16:30:01 np0005539040.novalocal python3[7136]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764347400.708563-103-35770478794108/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=fba291f113da634eba7dd6f3c66b5edfcb592b3b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 16:30:01 np0005539040.novalocal sudo[7134]: pam_unix(sudo:session): session closed for user root
Nov 28 16:30:01 np0005539040.novalocal sudo[7184]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-azrrltskhazmptuxhvdnlrfiuuscbbpl ; OS_CLOUD=vexxhost /usr/bin/python3'
Nov 28 16:30:01 np0005539040.novalocal sudo[7184]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 16:30:02 np0005539040.novalocal python3[7186]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 16:30:02 np0005539040.novalocal systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Nov 28 16:30:02 np0005539040.novalocal systemd[1]: Stopped Network Manager Wait Online.
Nov 28 16:30:02 np0005539040.novalocal systemd[1]: Stopping Network Manager Wait Online...
Nov 28 16:30:02 np0005539040.novalocal systemd[1]: Stopping Network Manager...
Nov 28 16:30:02 np0005539040.novalocal NetworkManager[858]: <info>  [1764347402.1759] caught SIGTERM, shutting down normally.
Nov 28 16:30:02 np0005539040.novalocal NetworkManager[858]: <info>  [1764347402.1772] dhcp4 (eth0): canceled DHCP transaction
Nov 28 16:30:02 np0005539040.novalocal NetworkManager[858]: <info>  [1764347402.1773] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 28 16:30:02 np0005539040.novalocal NetworkManager[858]: <info>  [1764347402.1773] dhcp4 (eth0): state changed no lease
Nov 28 16:30:02 np0005539040.novalocal NetworkManager[858]: <info>  [1764347402.1775] manager: NetworkManager state is now CONNECTING
Nov 28 16:30:02 np0005539040.novalocal NetworkManager[858]: <info>  [1764347402.1899] dhcp4 (eth1): canceled DHCP transaction
Nov 28 16:30:02 np0005539040.novalocal NetworkManager[858]: <info>  [1764347402.1899] dhcp4 (eth1): state changed no lease
Nov 28 16:30:02 np0005539040.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 28 16:30:02 np0005539040.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 28 16:30:02 np0005539040.novalocal NetworkManager[858]: <info>  [1764347402.2934] exiting (success)
Nov 28 16:30:02 np0005539040.novalocal systemd[1]: NetworkManager.service: Deactivated successfully.
Nov 28 16:30:02 np0005539040.novalocal systemd[1]: Stopped Network Manager.
Nov 28 16:30:02 np0005539040.novalocal systemd[1]: NetworkManager.service: Consumed 3.318s CPU time, 9.9M memory peak.
Nov 28 16:30:02 np0005539040.novalocal systemd[1]: Starting Network Manager...
Nov 28 16:30:02 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347402.3658] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:554a2c9a-1114-4202-a0a9-67957e011662)
Nov 28 16:30:02 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347402.3660] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Nov 28 16:30:02 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347402.3721] manager[0x55bc9633b070]: monitoring kernel firmware directory '/lib/firmware'.
Nov 28 16:30:02 np0005539040.novalocal systemd[1]: Starting Hostname Service...
Nov 28 16:30:02 np0005539040.novalocal systemd[1]: Started Hostname Service.
Nov 28 16:30:02 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347402.4507] hostname: hostname: using hostnamed
Nov 28 16:30:02 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347402.4507] hostname: static hostname changed from (none) to "np0005539040.novalocal"
Nov 28 16:30:02 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347402.4513] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 28 16:30:02 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347402.4517] manager[0x55bc9633b070]: rfkill: Wi-Fi hardware radio set enabled
Nov 28 16:30:02 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347402.4518] manager[0x55bc9633b070]: rfkill: WWAN hardware radio set enabled
Nov 28 16:30:02 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347402.4553] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Nov 28 16:30:02 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347402.4554] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 28 16:30:02 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347402.4554] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 28 16:30:02 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347402.4554] manager: Networking is enabled by state file
Nov 28 16:30:02 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347402.4556] settings: Loaded settings plugin: keyfile (internal)
Nov 28 16:30:02 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347402.4564] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 28 16:30:02 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347402.4590] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Nov 28 16:30:02 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347402.4599] dhcp: init: Using DHCP client 'internal'
Nov 28 16:30:02 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347402.4602] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 28 16:30:02 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347402.4606] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 28 16:30:02 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347402.4612] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Nov 28 16:30:02 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347402.4618] device (lo): Activation: starting connection 'lo' (2464996e-5b9c-4662-8d02-714ba4ef3d59)
Nov 28 16:30:02 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347402.4623] device (eth0): carrier: link connected
Nov 28 16:30:02 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347402.4626] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 28 16:30:02 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347402.4630] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Nov 28 16:30:02 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347402.4630] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 28 16:30:02 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347402.4635] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 28 16:30:02 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347402.4639] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 28 16:30:02 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347402.4644] device (eth1): carrier: link connected
Nov 28 16:30:02 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347402.4646] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 28 16:30:02 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347402.4649] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (3805b732-7d96-3b33-9481-4cdfa6af1193) (indicated)
Nov 28 16:30:02 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347402.4650] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 28 16:30:02 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347402.4653] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 28 16:30:02 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347402.4658] device (eth1): Activation: starting connection 'Wired connection 1' (3805b732-7d96-3b33-9481-4cdfa6af1193)
Nov 28 16:30:02 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347402.4663] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 28 16:30:02 np0005539040.novalocal systemd[1]: Started Network Manager.
Nov 28 16:30:02 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347402.4681] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Nov 28 16:30:02 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347402.4683] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Nov 28 16:30:02 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347402.4685] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Nov 28 16:30:02 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347402.4686] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 28 16:30:02 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347402.4689] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 28 16:30:02 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347402.4691] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 28 16:30:02 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347402.4693] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 28 16:30:02 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347402.4695] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Nov 28 16:30:02 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347402.4700] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 28 16:30:02 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347402.4702] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 28 16:30:02 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347402.4708] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 28 16:30:02 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347402.4710] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 28 16:30:02 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347402.4723] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Nov 28 16:30:02 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347402.4728] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Nov 28 16:30:02 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347402.4733] device (lo): Activation: successful, device activated.
Nov 28 16:30:02 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347402.4739] dhcp4 (eth0): state changed new lease, address=38.129.56.212
Nov 28 16:30:02 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347402.4745] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 28 16:30:02 np0005539040.novalocal systemd[1]: Starting Network Manager Wait Online...
Nov 28 16:30:02 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347402.4800] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 28 16:30:02 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347402.4830] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 28 16:30:02 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347402.4832] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 28 16:30:02 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347402.4834] manager: NetworkManager state is now CONNECTED_SITE
Nov 28 16:30:02 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347402.4837] device (eth0): Activation: successful, device activated.
Nov 28 16:30:02 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347402.4841] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 28 16:30:02 np0005539040.novalocal sudo[7184]: pam_unix(sudo:session): session closed for user root
Nov 28 16:30:02 np0005539040.novalocal python3[7271]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163ec2-ffbe-5720-4d5a-0000000000b2-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 16:30:12 np0005539040.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 28 16:30:15 np0005539040.novalocal sshd-session[7274]: Connection closed by authenticating user root 188.166.104.67 port 51300 [preauth]
Nov 28 16:30:32 np0005539040.novalocal systemd[4305]: Starting Mark boot as successful...
Nov 28 16:30:32 np0005539040.novalocal systemd[4305]: Finished Mark boot as successful.
Nov 28 16:30:32 np0005539040.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 28 16:30:48 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347448.1493] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 28 16:30:48 np0005539040.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 28 16:30:48 np0005539040.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 28 16:30:48 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347448.1847] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 28 16:30:48 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347448.1850] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 28 16:30:48 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347448.1858] device (eth1): Activation: successful, device activated.
Nov 28 16:30:48 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347448.1864] manager: startup complete
Nov 28 16:30:48 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347448.1867] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Nov 28 16:30:48 np0005539040.novalocal NetworkManager[7203]: <warn>  [1764347448.1872] device (eth1): Activation: failed for connection 'Wired connection 1'
Nov 28 16:30:48 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347448.1878] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Nov 28 16:30:48 np0005539040.novalocal systemd[1]: Finished Network Manager Wait Online.
Nov 28 16:30:48 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347448.1968] dhcp4 (eth1): canceled DHCP transaction
Nov 28 16:30:48 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347448.1968] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 28 16:30:48 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347448.1968] dhcp4 (eth1): state changed no lease
Nov 28 16:30:48 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347448.1981] policy: auto-activating connection 'ci-private-network' (3602a993-b6bf-5e94-b722-2dfc0f2c6254)
Nov 28 16:30:48 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347448.1985] device (eth1): Activation: starting connection 'ci-private-network' (3602a993-b6bf-5e94-b722-2dfc0f2c6254)
Nov 28 16:30:48 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347448.1986] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 28 16:30:48 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347448.1988] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 28 16:30:48 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347448.1996] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 28 16:30:48 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347448.2004] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 28 16:30:48 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347448.2045] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 28 16:30:48 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347448.2046] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 28 16:30:48 np0005539040.novalocal NetworkManager[7203]: <info>  [1764347448.2052] device (eth1): Activation: successful, device activated.
Nov 28 16:30:53 np0005539040.novalocal sshd-session[7304]: Connection closed by authenticating user root 188.166.104.67 port 51418 [preauth]
Nov 28 16:30:58 np0005539040.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 28 16:31:02 np0005539040.novalocal sshd-session[4316]: Received disconnect from 38.102.83.114 port 56120:11: disconnected by user
Nov 28 16:31:02 np0005539040.novalocal sshd-session[4316]: Disconnected from user zuul 38.102.83.114 port 56120
Nov 28 16:31:02 np0005539040.novalocal sshd-session[4301]: pam_unix(sshd:session): session closed for user zuul
Nov 28 16:31:02 np0005539040.novalocal systemd-logind[788]: Session 1 logged out. Waiting for processes to exit.
Nov 28 16:31:30 np0005539040.novalocal sshd-session[7306]: Connection closed by authenticating user root 188.166.104.67 port 37278 [preauth]
Nov 28 16:31:30 np0005539040.novalocal sshd-session[7308]: Accepted publickey for zuul from 38.102.83.114 port 37900 ssh2: RSA SHA256:egEXJ6YZqvJAQvB4hRXcAb06pxwLPK5iz3to2ybg6o8
Nov 28 16:31:30 np0005539040.novalocal systemd-logind[788]: New session 3 of user zuul.
Nov 28 16:31:30 np0005539040.novalocal systemd[1]: Started Session 3 of User zuul.
Nov 28 16:31:30 np0005539040.novalocal sshd-session[7308]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 28 16:31:30 np0005539040.novalocal sudo[7387]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvdzcovyiankcjhzvnucfdxrmazocdtb ; OS_CLOUD=vexxhost /usr/bin/python3'
Nov 28 16:31:30 np0005539040.novalocal sudo[7387]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 16:31:31 np0005539040.novalocal python3[7389]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 16:31:31 np0005539040.novalocal sudo[7387]: pam_unix(sudo:session): session closed for user root
Nov 28 16:31:31 np0005539040.novalocal sudo[7460]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjaftbvgjoipwpisjogrqvvelnourrzm ; OS_CLOUD=vexxhost /usr/bin/python3'
Nov 28 16:31:31 np0005539040.novalocal sudo[7460]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 16:31:31 np0005539040.novalocal python3[7462]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764347490.8667903-309-128231331595518/source _original_basename=tmp6dbnaq_m follow=False checksum=e43d42ce20b3afc3f5e59af19912bc83e259d481 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 16:31:31 np0005539040.novalocal sudo[7460]: pam_unix(sudo:session): session closed for user root
Nov 28 16:31:34 np0005539040.novalocal sshd-session[7311]: Connection closed by 38.102.83.114 port 37900
Nov 28 16:31:34 np0005539040.novalocal sshd-session[7308]: pam_unix(sshd:session): session closed for user zuul
Nov 28 16:31:34 np0005539040.novalocal systemd[1]: session-3.scope: Deactivated successfully.
Nov 28 16:31:34 np0005539040.novalocal systemd-logind[788]: Session 3 logged out. Waiting for processes to exit.
Nov 28 16:31:34 np0005539040.novalocal systemd-logind[788]: Removed session 3.
Nov 28 16:32:07 np0005539040.novalocal sshd-session[7488]: Connection closed by authenticating user root 188.166.104.67 port 54274 [preauth]
Nov 28 16:32:41 np0005539040.novalocal sshd-session[7491]: Connection closed by authenticating user root 188.166.104.67 port 48616 [preauth]
Nov 28 16:33:16 np0005539040.novalocal sshd-session[7493]: Connection closed by authenticating user root 188.166.104.67 port 52648 [preauth]
Nov 28 16:33:32 np0005539040.novalocal systemd[4305]: Created slice User Background Tasks Slice.
Nov 28 16:33:32 np0005539040.novalocal systemd[4305]: Starting Cleanup of User's Temporary Files and Directories...
Nov 28 16:33:32 np0005539040.novalocal systemd[4305]: Finished Cleanup of User's Temporary Files and Directories.
Nov 28 16:33:51 np0005539040.novalocal sshd-session[7496]: Connection closed by authenticating user root 188.166.104.67 port 54986 [preauth]
Nov 28 16:34:28 np0005539040.novalocal sshd-session[7498]: Connection closed by authenticating user root 188.166.104.67 port 49458 [preauth]
Nov 28 16:35:04 np0005539040.novalocal sshd-session[7500]: Connection closed by authenticating user root 188.166.104.67 port 40416 [preauth]
Nov 28 16:35:32 np0005539040.novalocal systemd[1]: Starting Cleanup of Temporary Directories...
Nov 28 16:35:32 np0005539040.novalocal systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Nov 28 16:35:32 np0005539040.novalocal systemd[1]: Finished Cleanup of Temporary Directories.
Nov 28 16:35:32 np0005539040.novalocal systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Nov 28 16:35:40 np0005539040.novalocal sshd-session[7504]: Connection closed by authenticating user root 188.166.104.67 port 48456 [preauth]
Nov 28 16:36:15 np0005539040.novalocal sshd-session[7506]: Connection closed by authenticating user root 188.166.104.67 port 51578 [preauth]
Nov 28 16:36:50 np0005539040.novalocal sshd-session[7508]: Connection closed by authenticating user root 188.166.104.67 port 59654 [preauth]
Nov 28 16:37:10 np0005539040.novalocal sshd-session[7511]: Accepted publickey for zuul from 38.102.83.114 port 48736 ssh2: RSA SHA256:egEXJ6YZqvJAQvB4hRXcAb06pxwLPK5iz3to2ybg6o8
Nov 28 16:37:10 np0005539040.novalocal systemd-logind[788]: New session 4 of user zuul.
Nov 28 16:37:10 np0005539040.novalocal systemd[1]: Started Session 4 of User zuul.
Nov 28 16:37:10 np0005539040.novalocal sshd-session[7511]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 28 16:37:11 np0005539040.novalocal sudo[7538]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntkqmemgyceabxsmckcmgytnolvvdwce ; /usr/bin/python3'
Nov 28 16:37:11 np0005539040.novalocal sudo[7538]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 16:37:11 np0005539040.novalocal python3[7540]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda _uses_shell=True zuul_log_id=fa163ec2-ffbe-6b04-72d1-000000000c9d-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 16:37:11 np0005539040.novalocal sudo[7538]: pam_unix(sudo:session): session closed for user root
Nov 28 16:37:11 np0005539040.novalocal sudo[7566]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zfswpzfnghlxaectqvsbwmcvhjpboaun ; /usr/bin/python3'
Nov 28 16:37:11 np0005539040.novalocal sudo[7566]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 16:37:11 np0005539040.novalocal python3[7568]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 16:37:11 np0005539040.novalocal sudo[7566]: pam_unix(sudo:session): session closed for user root
Nov 28 16:37:11 np0005539040.novalocal sudo[7592]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwacxiuujauujdffgjcacqoaqytkzvlv ; /usr/bin/python3'
Nov 28 16:37:11 np0005539040.novalocal sudo[7592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 16:37:11 np0005539040.novalocal python3[7594]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 16:37:11 np0005539040.novalocal sudo[7592]: pam_unix(sudo:session): session closed for user root
Nov 28 16:37:11 np0005539040.novalocal sudo[7619]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzrpzvmezarnytrbvlzrqbfsfzptcatv ; /usr/bin/python3'
Nov 28 16:37:11 np0005539040.novalocal sudo[7619]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 16:37:12 np0005539040.novalocal python3[7621]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 16:37:12 np0005539040.novalocal sudo[7619]: pam_unix(sudo:session): session closed for user root
Nov 28 16:37:12 np0005539040.novalocal sudo[7645]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wezcqmlhzvllxpnctidicwzfwcvdsvku ; /usr/bin/python3'
Nov 28 16:37:12 np0005539040.novalocal sudo[7645]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 16:37:12 np0005539040.novalocal python3[7647]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 16:37:12 np0005539040.novalocal sudo[7645]: pam_unix(sudo:session): session closed for user root
Nov 28 16:37:12 np0005539040.novalocal sudo[7671]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzgcxjpcftudkmmsmuljeibgemmebywa ; /usr/bin/python3'
Nov 28 16:37:12 np0005539040.novalocal sudo[7671]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 16:37:13 np0005539040.novalocal python3[7673]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 16:37:13 np0005539040.novalocal sudo[7671]: pam_unix(sudo:session): session closed for user root
Nov 28 16:37:13 np0005539040.novalocal sudo[7749]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zivsqfbqqnzyqfwaqmtwnwssqzkzwlde ; /usr/bin/python3'
Nov 28 16:37:13 np0005539040.novalocal sudo[7749]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 16:37:13 np0005539040.novalocal python3[7751]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 16:37:13 np0005539040.novalocal sudo[7749]: pam_unix(sudo:session): session closed for user root
Nov 28 16:37:13 np0005539040.novalocal sudo[7822]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hsjofcienchojwuwktsgzlxnschjhety ; /usr/bin/python3'
Nov 28 16:37:13 np0005539040.novalocal sudo[7822]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 16:37:13 np0005539040.novalocal python3[7824]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764347833.254411-345-169333629378104/source _original_basename=tmp4077lmuj follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 16:37:13 np0005539040.novalocal sudo[7822]: pam_unix(sudo:session): session closed for user root
Nov 28 16:37:14 np0005539040.novalocal sudo[7872]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-acniqvytlqhcaumrknltrtnkhtcnnkbr ; /usr/bin/python3'
Nov 28 16:37:14 np0005539040.novalocal sudo[7872]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 16:37:14 np0005539040.novalocal python3[7874]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 16:37:14 np0005539040.novalocal systemd[1]: Reloading.
Nov 28 16:37:14 np0005539040.novalocal systemd-rc-local-generator[7895]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 16:37:15 np0005539040.novalocal sudo[7872]: pam_unix(sudo:session): session closed for user root
Nov 28 16:37:16 np0005539040.novalocal sudo[7928]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-padoekfmcntdwcqwztcwfdxaeqxorpnv ; /usr/bin/python3'
Nov 28 16:37:16 np0005539040.novalocal sudo[7928]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 16:37:16 np0005539040.novalocal python3[7930]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Nov 28 16:37:16 np0005539040.novalocal sudo[7928]: pam_unix(sudo:session): session closed for user root
Nov 28 16:37:17 np0005539040.novalocal sudo[7954]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aoswtollsmnxnyngrvudylebdfoyfhma ; /usr/bin/python3'
Nov 28 16:37:17 np0005539040.novalocal sudo[7954]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 16:37:17 np0005539040.novalocal python3[7956]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 16:37:17 np0005539040.novalocal sudo[7954]: pam_unix(sudo:session): session closed for user root
Nov 28 16:37:17 np0005539040.novalocal sudo[7984]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvhutearhxjhadgxmdouiumvdplbvjjj ; /usr/bin/python3'
Nov 28 16:37:17 np0005539040.novalocal sudo[7984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 16:37:17 np0005539040.novalocal python3[7986]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 16:37:17 np0005539040.novalocal sudo[7984]: pam_unix(sudo:session): session closed for user root
Nov 28 16:37:17 np0005539040.novalocal sudo[8012]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fytqvbnzjwbtmhnmjxbhsjlwhyutoklk ; /usr/bin/python3'
Nov 28 16:37:17 np0005539040.novalocal sudo[8012]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 16:37:18 np0005539040.novalocal python3[8014]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 16:37:18 np0005539040.novalocal sshd-session[7959]: Invalid user support from 78.128.112.74 port 49372
Nov 28 16:37:18 np0005539040.novalocal sudo[8012]: pam_unix(sudo:session): session closed for user root
Nov 28 16:37:18 np0005539040.novalocal sudo[8040]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdqkmaoqjeebuhxjvrhdillunwicjrll ; /usr/bin/python3'
Nov 28 16:37:18 np0005539040.novalocal sudo[8040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 16:37:18 np0005539040.novalocal sshd-session[7959]: Connection closed by invalid user support 78.128.112.74 port 49372 [preauth]
Nov 28 16:37:18 np0005539040.novalocal python3[8042]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 16:37:18 np0005539040.novalocal sudo[8040]: pam_unix(sudo:session): session closed for user root
Nov 28 16:37:19 np0005539040.novalocal python3[8069]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;
                                                       _uses_shell=True zuul_log_id=fa163ec2-ffbe-6b04-72d1-000000000ca4-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 16:37:20 np0005539040.novalocal python3[8099]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 16:37:22 np0005539040.novalocal sshd-session[7514]: Connection closed by 38.102.83.114 port 48736
Nov 28 16:37:22 np0005539040.novalocal sshd-session[7511]: pam_unix(sshd:session): session closed for user zuul
Nov 28 16:37:22 np0005539040.novalocal systemd[1]: session-4.scope: Deactivated successfully.
Nov 28 16:37:22 np0005539040.novalocal systemd[1]: session-4.scope: Consumed 3.800s CPU time.
Nov 28 16:37:22 np0005539040.novalocal systemd-logind[788]: Session 4 logged out. Waiting for processes to exit.
Nov 28 16:37:22 np0005539040.novalocal systemd-logind[788]: Removed session 4.
Nov 28 16:37:24 np0005539040.novalocal sshd-session[8107]: Accepted publickey for zuul from 38.102.83.114 port 33516 ssh2: RSA SHA256:egEXJ6YZqvJAQvB4hRXcAb06pxwLPK5iz3to2ybg6o8
Nov 28 16:37:24 np0005539040.novalocal systemd-logind[788]: New session 5 of user zuul.
Nov 28 16:37:24 np0005539040.novalocal systemd[1]: Started Session 5 of User zuul.
Nov 28 16:37:24 np0005539040.novalocal sshd-session[8107]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 28 16:37:24 np0005539040.novalocal sshd-session[8104]: Connection closed by authenticating user root 188.166.104.67 port 49632 [preauth]
Nov 28 16:37:24 np0005539040.novalocal sudo[8134]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrrmhbqxhdioiyilhdyprpennnefqlru ; /usr/bin/python3'
Nov 28 16:37:24 np0005539040.novalocal sudo[8134]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 16:37:24 np0005539040.novalocal python3[8136]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 28 16:37:41 np0005539040.novalocal kernel: SELinux:  Converting 385 SID table entries...
Nov 28 16:37:41 np0005539040.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Nov 28 16:37:41 np0005539040.novalocal kernel: SELinux:  policy capability open_perms=1
Nov 28 16:37:41 np0005539040.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Nov 28 16:37:41 np0005539040.novalocal kernel: SELinux:  policy capability always_check_network=0
Nov 28 16:37:41 np0005539040.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 28 16:37:41 np0005539040.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 28 16:37:41 np0005539040.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 28 16:37:50 np0005539040.novalocal kernel: SELinux:  Converting 385 SID table entries...
Nov 28 16:37:50 np0005539040.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Nov 28 16:37:50 np0005539040.novalocal kernel: SELinux:  policy capability open_perms=1
Nov 28 16:37:50 np0005539040.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Nov 28 16:37:50 np0005539040.novalocal kernel: SELinux:  policy capability always_check_network=0
Nov 28 16:37:50 np0005539040.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 28 16:37:50 np0005539040.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 28 16:37:50 np0005539040.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 28 16:37:58 np0005539040.novalocal sshd-session[8196]: Connection closed by authenticating user root 188.166.104.67 port 43158 [preauth]
Nov 28 16:37:59 np0005539040.novalocal kernel: SELinux:  Converting 385 SID table entries...
Nov 28 16:37:59 np0005539040.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Nov 28 16:37:59 np0005539040.novalocal kernel: SELinux:  policy capability open_perms=1
Nov 28 16:37:59 np0005539040.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Nov 28 16:37:59 np0005539040.novalocal kernel: SELinux:  policy capability always_check_network=0
Nov 28 16:37:59 np0005539040.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 28 16:37:59 np0005539040.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 28 16:37:59 np0005539040.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 28 16:38:01 np0005539040.novalocal setsebool[8206]: The virt_use_nfs policy boolean was changed to 1 by root
Nov 28 16:38:01 np0005539040.novalocal setsebool[8206]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Nov 28 16:38:12 np0005539040.novalocal kernel: SELinux:  Converting 388 SID table entries...
Nov 28 16:38:12 np0005539040.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Nov 28 16:38:12 np0005539040.novalocal kernel: SELinux:  policy capability open_perms=1
Nov 28 16:38:12 np0005539040.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Nov 28 16:38:12 np0005539040.novalocal kernel: SELinux:  policy capability always_check_network=0
Nov 28 16:38:12 np0005539040.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 28 16:38:12 np0005539040.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 28 16:38:12 np0005539040.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 28 16:38:30 np0005539040.novalocal dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Nov 28 16:38:30 np0005539040.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 28 16:38:30 np0005539040.novalocal systemd[1]: Starting man-db-cache-update.service...
Nov 28 16:38:30 np0005539040.novalocal systemd[1]: Reloading.
Nov 28 16:38:30 np0005539040.novalocal systemd-rc-local-generator[8959]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 16:38:30 np0005539040.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Nov 28 16:38:31 np0005539040.novalocal sshd-session[9314]: Connection closed by authenticating user root 188.166.104.67 port 53524 [preauth]
Nov 28 16:38:32 np0005539040.novalocal sudo[8134]: pam_unix(sudo:session): session closed for user root
Nov 28 16:38:34 np0005539040.novalocal python3[12745]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"
                                                        _uses_shell=True zuul_log_id=fa163ec2-ffbe-fc6a-21bb-00000000000b-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 16:38:35 np0005539040.novalocal kernel: evm: overlay not supported
Nov 28 16:38:35 np0005539040.novalocal systemd[4305]: Starting D-Bus User Message Bus...
Nov 28 16:38:35 np0005539040.novalocal dbus-broker-launch[13718]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Nov 28 16:38:35 np0005539040.novalocal dbus-broker-launch[13718]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Nov 28 16:38:35 np0005539040.novalocal systemd[4305]: Started D-Bus User Message Bus.
Nov 28 16:38:35 np0005539040.novalocal dbus-broker-lau[13718]: Ready
Nov 28 16:38:35 np0005539040.novalocal systemd[4305]: selinux: avc:  op=load_policy lsm=selinux seqno=6 res=1
Nov 28 16:38:35 np0005539040.novalocal systemd[4305]: Created slice Slice /user.
Nov 28 16:38:35 np0005539040.novalocal systemd[4305]: podman-13593.scope: unit configures an IP firewall, but not running as root.
Nov 28 16:38:35 np0005539040.novalocal systemd[4305]: (This warning is only shown for the first unit using IP firewalling.)
Nov 28 16:38:35 np0005539040.novalocal systemd[4305]: Started podman-13593.scope.
Nov 28 16:38:35 np0005539040.novalocal systemd[4305]: Started podman-pause-8489554c.scope.
Nov 28 16:38:38 np0005539040.novalocal sudo[15163]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gouqfklxwjphggchimgpgtrrqnvorljj ; /usr/bin/python3'
Nov 28 16:38:38 np0005539040.novalocal sudo[15163]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 16:38:39 np0005539040.novalocal python3[15175]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]
                                                       location = "38.102.83.73:5001"
                                                       insecure = true path=/etc/containers/registries.conf block=[[registry]]
                                                       location = "38.102.83.73:5001"
                                                       insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 16:38:39 np0005539040.novalocal python3[15175]: ansible-ansible.builtin.blockinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Nov 28 16:38:39 np0005539040.novalocal sudo[15163]: pam_unix(sudo:session): session closed for user root
Nov 28 16:38:39 np0005539040.novalocal sshd-session[8110]: Connection closed by 38.102.83.114 port 33516
Nov 28 16:38:39 np0005539040.novalocal sshd-session[8107]: pam_unix(sshd:session): session closed for user zuul
Nov 28 16:38:39 np0005539040.novalocal systemd[1]: session-5.scope: Deactivated successfully.
Nov 28 16:38:39 np0005539040.novalocal systemd[1]: session-5.scope: Consumed 1min 364ms CPU time.
Nov 28 16:38:39 np0005539040.novalocal systemd-logind[788]: Session 5 logged out. Waiting for processes to exit.
Nov 28 16:38:39 np0005539040.novalocal systemd-logind[788]: Removed session 5.
Nov 28 16:38:58 np0005539040.novalocal sshd-session[24578]: Unable to negotiate with 38.129.56.243 port 59482: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Nov 28 16:38:58 np0005539040.novalocal sshd-session[24580]: Connection closed by 38.129.56.243 port 59466 [preauth]
Nov 28 16:38:58 np0005539040.novalocal sshd-session[24585]: Connection closed by 38.129.56.243 port 59472 [preauth]
Nov 28 16:38:58 np0005539040.novalocal sshd-session[24586]: Unable to negotiate with 38.129.56.243 port 59480: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Nov 28 16:38:58 np0005539040.novalocal sshd-session[24583]: Unable to negotiate with 38.129.56.243 port 59492: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Nov 28 16:39:03 np0005539040.novalocal sshd-session[26726]: Accepted publickey for zuul from 38.102.83.114 port 43868 ssh2: RSA SHA256:egEXJ6YZqvJAQvB4hRXcAb06pxwLPK5iz3to2ybg6o8
Nov 28 16:39:03 np0005539040.novalocal systemd-logind[788]: New session 6 of user zuul.
Nov 28 16:39:04 np0005539040.novalocal systemd[1]: Started Session 6 of User zuul.
Nov 28 16:39:04 np0005539040.novalocal sshd-session[26726]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 28 16:39:04 np0005539040.novalocal python3[26821]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBO97aOeom13lvEbcA0huPU7aKwR6qk5ldRtXYiXoR3CD0kEGXFB5h1nxeoWlIRZfeSMVmNzHSZJRHPa46BYMvB4= zuul@np0005539039.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 16:39:04 np0005539040.novalocal sudo[26997]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zavurbckxhfiuwkdnvdxeyrpgycfueqo ; /usr/bin/python3'
Nov 28 16:39:04 np0005539040.novalocal sudo[26997]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 16:39:04 np0005539040.novalocal python3[27006]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBO97aOeom13lvEbcA0huPU7aKwR6qk5ldRtXYiXoR3CD0kEGXFB5h1nxeoWlIRZfeSMVmNzHSZJRHPa46BYMvB4= zuul@np0005539039.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 16:39:04 np0005539040.novalocal sudo[26997]: pam_unix(sudo:session): session closed for user root
Nov 28 16:39:05 np0005539040.novalocal sshd-session[27067]: Connection closed by authenticating user root 188.166.104.67 port 33622 [preauth]
Nov 28 16:39:05 np0005539040.novalocal sudo[27408]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmjtzdnrcmidgdunugymbectvmxbirow ; /usr/bin/python3'
Nov 28 16:39:05 np0005539040.novalocal sudo[27408]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 16:39:05 np0005539040.novalocal python3[27417]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005539040.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Nov 28 16:39:05 np0005539040.novalocal useradd[27496]: new group: name=cloud-admin, GID=1002
Nov 28 16:39:05 np0005539040.novalocal useradd[27496]: new user: name=cloud-admin, UID=1002, GID=1002, home=/home/cloud-admin, shell=/bin/bash, from=none
Nov 28 16:39:05 np0005539040.novalocal sudo[27408]: pam_unix(sudo:session): session closed for user root
Nov 28 16:39:05 np0005539040.novalocal sudo[27636]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzgoljyzdypestuvkfbvovtzswfvfsol ; /usr/bin/python3'
Nov 28 16:39:05 np0005539040.novalocal sudo[27636]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 16:39:06 np0005539040.novalocal python3[27645]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBO97aOeom13lvEbcA0huPU7aKwR6qk5ldRtXYiXoR3CD0kEGXFB5h1nxeoWlIRZfeSMVmNzHSZJRHPa46BYMvB4= zuul@np0005539039.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 16:39:06 np0005539040.novalocal sudo[27636]: pam_unix(sudo:session): session closed for user root
Nov 28 16:39:06 np0005539040.novalocal sudo[27901]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvtcsyvjnugdzmbiqvidcgxpzqgntngq ; /usr/bin/python3'
Nov 28 16:39:06 np0005539040.novalocal sudo[27901]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 16:39:06 np0005539040.novalocal python3[27908]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 16:39:06 np0005539040.novalocal sudo[27901]: pam_unix(sudo:session): session closed for user root
Nov 28 16:39:06 np0005539040.novalocal sudo[28161]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfibafggwafjllnhomtmjstyeftjoofr ; /usr/bin/python3'
Nov 28 16:39:06 np0005539040.novalocal sudo[28161]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 16:39:06 np0005539040.novalocal python3[28172]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764347946.2072585-151-141419330304455/source _original_basename=tmpwmo1ujr0 follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 16:39:06 np0005539040.novalocal sudo[28161]: pam_unix(sudo:session): session closed for user root
Nov 28 16:39:07 np0005539040.novalocal sudo[28517]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbqwgygdthdqdmfthimohlfgfypyqzdw ; /usr/bin/python3'
Nov 28 16:39:07 np0005539040.novalocal sudo[28517]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 16:39:07 np0005539040.novalocal python3[28527]: ansible-ansible.builtin.hostname Invoked with name=compute-0 use=systemd
Nov 28 16:39:07 np0005539040.novalocal systemd[1]: Starting Hostname Service...
Nov 28 16:39:07 np0005539040.novalocal systemd[1]: Started Hostname Service.
Nov 28 16:39:07 np0005539040.novalocal systemd-hostnamed[28642]: Changed pretty hostname to 'compute-0'
Nov 28 16:39:07 compute-0 systemd-hostnamed[28642]: Hostname set to <compute-0> (static)
Nov 28 16:39:07 compute-0 NetworkManager[7203]: <info>  [1764347947.8640] hostname: static hostname changed from "np0005539040.novalocal" to "compute-0"
Nov 28 16:39:07 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 28 16:39:07 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 28 16:39:07 compute-0 sudo[28517]: pam_unix(sudo:session): session closed for user root
Nov 28 16:39:08 compute-0 sshd-session[26757]: Connection closed by 38.102.83.114 port 43868
Nov 28 16:39:08 compute-0 sshd-session[26726]: pam_unix(sshd:session): session closed for user zuul
Nov 28 16:39:08 compute-0 systemd-logind[788]: Session 6 logged out. Waiting for processes to exit.
Nov 28 16:39:08 compute-0 systemd[1]: session-6.scope: Deactivated successfully.
Nov 28 16:39:08 compute-0 systemd[1]: session-6.scope: Consumed 2.232s CPU time.
Nov 28 16:39:08 compute-0 systemd-logind[788]: Removed session 6.
Nov 28 16:39:10 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 28 16:39:10 compute-0 systemd[1]: Finished man-db-cache-update.service.
Nov 28 16:39:10 compute-0 systemd[1]: man-db-cache-update.service: Consumed 48.290s CPU time.
Nov 28 16:39:10 compute-0 systemd[1]: run-rdfb29417dc01407e9287b45c19a5fd63.service: Deactivated successfully.
Nov 28 16:39:17 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 28 16:39:37 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 28 16:39:40 compute-0 sshd-session[30014]: Connection closed by authenticating user root 188.166.104.67 port 42996 [preauth]
Nov 28 16:40:13 compute-0 sshd-session[30016]: Connection closed by authenticating user root 188.166.104.67 port 54734 [preauth]
Nov 28 16:40:49 compute-0 sshd-session[30020]: Connection closed by authenticating user root 188.166.104.67 port 59364 [preauth]
Nov 28 16:41:25 compute-0 sshd-session[30023]: Connection closed by authenticating user root 188.166.104.67 port 54136 [preauth]
Nov 28 16:41:39 compute-0 sshd-session[30025]: Connection closed by 121.43.178.245 port 51864
Nov 28 16:42:00 compute-0 sshd-session[30026]: Connection closed by authenticating user root 188.166.104.67 port 60778 [preauth]
Nov 28 16:42:35 compute-0 sshd-session[30028]: Connection closed by authenticating user root 188.166.104.67 port 44368 [preauth]
Nov 28 16:43:11 compute-0 sshd-session[30031]: Connection closed by authenticating user root 188.166.104.67 port 33608 [preauth]
Nov 28 16:43:18 compute-0 sshd-session[30033]: Accepted publickey for zuul from 38.129.56.243 port 33416 ssh2: RSA SHA256:egEXJ6YZqvJAQvB4hRXcAb06pxwLPK5iz3to2ybg6o8
Nov 28 16:43:18 compute-0 systemd-logind[788]: New session 7 of user zuul.
Nov 28 16:43:18 compute-0 systemd[1]: Started Session 7 of User zuul.
Nov 28 16:43:18 compute-0 sshd-session[30033]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 28 16:43:19 compute-0 python3[30109]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 16:43:20 compute-0 sudo[30223]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yddhhquasuaoypsflodwghwgtyyunuxs ; /usr/bin/python3'
Nov 28 16:43:20 compute-0 sudo[30223]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 16:43:21 compute-0 python3[30225]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 16:43:21 compute-0 sudo[30223]: pam_unix(sudo:session): session closed for user root
Nov 28 16:43:21 compute-0 sudo[30296]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uuubuzsjfsbfjpibuwbkawjtutevdyho ; /usr/bin/python3'
Nov 28 16:43:21 compute-0 sudo[30296]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 16:43:21 compute-0 python3[30298]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764348200.8207695-33865-144648198036905/source mode=0755 _original_basename=delorean.repo follow=False checksum=a16f090252000d02a7f7d540bb10f7c1c9cd4ac5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 16:43:21 compute-0 sudo[30296]: pam_unix(sudo:session): session closed for user root
Nov 28 16:43:21 compute-0 sudo[30322]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvoqgrcbfxsmwnifqtwysgmqejazxhby ; /usr/bin/python3'
Nov 28 16:43:21 compute-0 sudo[30322]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 16:43:21 compute-0 python3[30324]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 16:43:21 compute-0 sudo[30322]: pam_unix(sudo:session): session closed for user root
Nov 28 16:43:22 compute-0 sudo[30395]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hytglwezgfltcrduxuyjdagdzignypfg ; /usr/bin/python3'
Nov 28 16:43:22 compute-0 sudo[30395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 16:43:22 compute-0 python3[30397]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764348200.8207695-33865-144648198036905/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=0bdbb813b840548359ae77c28d76ca272ccaf31b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 16:43:22 compute-0 sudo[30395]: pam_unix(sudo:session): session closed for user root
Nov 28 16:43:22 compute-0 sudo[30421]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pslhaitsknxgvkhfrxxzcdkafmiqzxpb ; /usr/bin/python3'
Nov 28 16:43:22 compute-0 sudo[30421]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 16:43:22 compute-0 python3[30423]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 16:43:22 compute-0 sudo[30421]: pam_unix(sudo:session): session closed for user root
Nov 28 16:43:22 compute-0 sudo[30494]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wekcexvtnljpgokbnvaarqlwodvlsoqb ; /usr/bin/python3'
Nov 28 16:43:22 compute-0 sudo[30494]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 16:43:22 compute-0 python3[30496]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764348200.8207695-33865-144648198036905/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 16:43:22 compute-0 sudo[30494]: pam_unix(sudo:session): session closed for user root
Nov 28 16:43:22 compute-0 sudo[30520]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzmpmruoiyysrmisvrqalfpccsvttrkk ; /usr/bin/python3'
Nov 28 16:43:22 compute-0 sudo[30520]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 16:43:22 compute-0 python3[30522]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 16:43:22 compute-0 sudo[30520]: pam_unix(sudo:session): session closed for user root
Nov 28 16:43:23 compute-0 sudo[30593]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvgjxfkhmgmhyidwftmypzoqyulsppfd ; /usr/bin/python3'
Nov 28 16:43:23 compute-0 sudo[30593]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 16:43:23 compute-0 python3[30595]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764348200.8207695-33865-144648198036905/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 16:43:23 compute-0 sudo[30593]: pam_unix(sudo:session): session closed for user root
Nov 28 16:43:23 compute-0 sudo[30619]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tgwkthizjkilhgpdlwkpmgvqeomtrgod ; /usr/bin/python3'
Nov 28 16:43:23 compute-0 sudo[30619]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 16:43:23 compute-0 python3[30621]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 16:43:23 compute-0 sudo[30619]: pam_unix(sudo:session): session closed for user root
Nov 28 16:43:23 compute-0 sudo[30692]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ctkmgvucpjbubusrdehwvvyjkzprgmin ; /usr/bin/python3'
Nov 28 16:43:23 compute-0 sudo[30692]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 16:43:23 compute-0 python3[30694]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764348200.8207695-33865-144648198036905/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 16:43:23 compute-0 sudo[30692]: pam_unix(sudo:session): session closed for user root
Nov 28 16:43:23 compute-0 sudo[30718]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mpktcvlioohgbhmsebnvrvsbrrztnvyr ; /usr/bin/python3'
Nov 28 16:43:23 compute-0 sudo[30718]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 16:43:24 compute-0 python3[30720]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 16:43:24 compute-0 sudo[30718]: pam_unix(sudo:session): session closed for user root
Nov 28 16:43:24 compute-0 sudo[30791]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hpwnrnqzznjdqqntwwokakxcjolyznrf ; /usr/bin/python3'
Nov 28 16:43:24 compute-0 sudo[30791]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 16:43:24 compute-0 python3[30793]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764348200.8207695-33865-144648198036905/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 16:43:24 compute-0 sudo[30791]: pam_unix(sudo:session): session closed for user root
Nov 28 16:43:24 compute-0 sudo[30817]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ktlcsdylgtjwgtyfzertjbulqfedcfvb ; /usr/bin/python3'
Nov 28 16:43:24 compute-0 sudo[30817]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 16:43:24 compute-0 python3[30819]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 16:43:24 compute-0 sudo[30817]: pam_unix(sudo:session): session closed for user root
Nov 28 16:43:24 compute-0 sudo[30890]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zjqlwllevaybkxhuyjbfgypaywoijwjq ; /usr/bin/python3'
Nov 28 16:43:24 compute-0 sudo[30890]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 16:43:25 compute-0 python3[30892]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764348200.8207695-33865-144648198036905/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=25e801a9a05537c191e2aa500f19076ac31d3e5b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 16:43:25 compute-0 sudo[30890]: pam_unix(sudo:session): session closed for user root
Nov 28 16:43:27 compute-0 sshd-session[30918]: Unable to negotiate with 192.168.122.11 port 54482: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Nov 28 16:43:27 compute-0 sshd-session[30917]: Connection closed by 192.168.122.11 port 54452 [preauth]
Nov 28 16:43:27 compute-0 sshd-session[30920]: Connection closed by 192.168.122.11 port 54468 [preauth]
Nov 28 16:43:27 compute-0 sshd-session[30919]: Unable to negotiate with 192.168.122.11 port 54494: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Nov 28 16:43:27 compute-0 sshd-session[30921]: Unable to negotiate with 192.168.122.11 port 54480: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Nov 28 16:43:45 compute-0 sshd-session[30927]: Connection closed by authenticating user root 188.166.104.67 port 47618 [preauth]
Nov 28 16:44:18 compute-0 sshd-session[30929]: Connection closed by authenticating user root 188.166.104.67 port 60884 [preauth]
Nov 28 16:44:32 compute-0 systemd[1]: Starting dnf makecache...
Nov 28 16:44:32 compute-0 dnf[30931]: Failed determining last makecache time.
Nov 28 16:44:33 compute-0 dnf[30931]: delorean-openstack-barbican-42b4c41831408a8e323 116 kB/s |  13 kB     00:00
Nov 28 16:44:33 compute-0 dnf[30931]: delorean-python-glean-10df0bd91b9bc5c9fd9cc02d7 1.5 MB/s |  65 kB     00:00
Nov 28 16:44:34 compute-0 dnf[30931]: delorean-openstack-cinder-1c00d6490d88e436f26ef  96 kB/s |  32 kB     00:00
Nov 28 16:44:34 compute-0 dnf[30931]: delorean-python-stevedore-c4acc5639fd2329372142 2.7 MB/s | 131 kB     00:00
Nov 28 16:44:34 compute-0 dnf[30931]: delorean-python-cloudkitty-tests-tempest-2c80f8 1.1 MB/s |  32 kB     00:00
Nov 28 16:44:34 compute-0 dnf[30931]: delorean-os-net-config-9758ab42364673d01bc5014e  13 MB/s | 349 kB     00:00
Nov 28 16:44:34 compute-0 dnf[30931]: delorean-openstack-nova-6f8decf0b4f1aa2e96292b6 1.9 MB/s |  42 kB     00:00
Nov 28 16:44:34 compute-0 dnf[30931]: delorean-python-designate-tests-tempest-347fdbc 801 kB/s |  18 kB     00:00
Nov 28 16:44:34 compute-0 dnf[30931]: delorean-openstack-glance-1fd12c29b339f30fe823e 893 kB/s |  18 kB     00:00
Nov 28 16:44:34 compute-0 dnf[30931]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 1.3 MB/s |  29 kB     00:00
Nov 28 16:44:34 compute-0 dnf[30931]: delorean-openstack-manila-3c01b7181572c95dac462 1.2 MB/s |  25 kB     00:00
Nov 28 16:44:34 compute-0 dnf[30931]: delorean-python-whitebox-neutron-tests-tempest- 5.9 MB/s | 154 kB     00:00
Nov 28 16:44:34 compute-0 dnf[30931]: delorean-openstack-octavia-ba397f07a7331190208c 1.2 MB/s |  26 kB     00:00
Nov 28 16:44:34 compute-0 dnf[30931]: delorean-openstack-watcher-c014f81a8647287f6dcc 807 kB/s |  16 kB     00:00
Nov 28 16:44:34 compute-0 dnf[30931]: delorean-python-tcib-1124124ec06aadbac34f0d340b 318 kB/s | 7.4 kB     00:00
Nov 28 16:44:34 compute-0 dnf[30931]: delorean-puppet-ceph-7352068d7b8c84ded636ab3158 5.0 MB/s | 144 kB     00:00
Nov 28 16:44:34 compute-0 dnf[30931]: delorean-openstack-swift-dc98a8463506ac520c469a 610 kB/s |  14 kB     00:00
Nov 28 16:44:34 compute-0 dnf[30931]: delorean-python-tempestconf-8515371b7cceebd4282 2.5 MB/s |  53 kB     00:00
Nov 28 16:44:34 compute-0 dnf[30931]: delorean-openstack-heat-ui-013accbfd179753bc3f0 4.2 MB/s |  96 kB     00:00
Nov 28 16:44:35 compute-0 dnf[30931]: CentOS Stream 9 - BaseOS                         73 kB/s | 7.3 kB     00:00
Nov 28 16:44:35 compute-0 dnf[30931]: CentOS Stream 9 - AppStream                      27 kB/s | 7.4 kB     00:00
Nov 28 16:44:35 compute-0 dnf[30931]: CentOS Stream 9 - CRB                            79 kB/s | 7.2 kB     00:00
Nov 28 16:44:35 compute-0 dnf[30931]: CentOS Stream 9 - Extras packages                24 kB/s | 8.3 kB     00:00
Nov 28 16:44:35 compute-0 dnf[30931]: dlrn-antelope-testing                            27 MB/s | 1.1 MB     00:00
Nov 28 16:44:36 compute-0 dnf[30931]: dlrn-antelope-build-deps                         15 MB/s | 461 kB     00:00
Nov 28 16:44:36 compute-0 dnf[30931]: centos9-rabbitmq                                804 kB/s | 123 kB     00:00
Nov 28 16:44:36 compute-0 dnf[30931]: centos9-storage                                  23 MB/s | 415 kB     00:00
Nov 28 16:44:36 compute-0 dnf[30931]: centos9-opstools                                4.7 MB/s |  51 kB     00:00
Nov 28 16:44:36 compute-0 dnf[30931]: NFV SIG OpenvSwitch                              19 MB/s | 456 kB     00:00
Nov 28 16:44:37 compute-0 dnf[30931]: repo-setup-centos-appstream                      42 MB/s |  25 MB     00:00
Nov 28 16:44:44 compute-0 dnf[30931]: repo-setup-centos-baseos                        9.3 MB/s | 8.8 MB     00:00
Nov 28 16:44:45 compute-0 dnf[30931]: repo-setup-centos-highavailability               15 MB/s | 744 kB     00:00
Nov 28 16:44:46 compute-0 dnf[30931]: repo-setup-centos-powertools                     67 MB/s | 7.3 MB     00:00
Nov 28 16:44:50 compute-0 dnf[30931]: Extra Packages for Enterprise Linux 9 - x86_64  9.8 MB/s |  20 MB     00:02
Nov 28 16:44:50 compute-0 sshd-session[31032]: Connection closed by authenticating user root 188.166.104.67 port 40026 [preauth]
Nov 28 16:45:03 compute-0 dnf[30931]: Metadata cache created.
Nov 28 16:45:03 compute-0 systemd[1]: dnf-makecache.service: Deactivated successfully.
Nov 28 16:45:03 compute-0 systemd[1]: Finished dnf makecache.
Nov 28 16:45:03 compute-0 systemd[1]: dnf-makecache.service: Consumed 23.960s CPU time.
Nov 28 16:45:23 compute-0 python3[31059]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 16:45:24 compute-0 sshd-session[31061]: Connection closed by authenticating user root 188.166.104.67 port 39898 [preauth]
Nov 28 16:46:02 compute-0 sshd-session[31063]: Connection closed by authenticating user root 188.166.104.67 port 48784 [preauth]
Nov 28 16:46:34 compute-0 sshd-session[31065]: Connection closed by 188.166.104.67 port 47948
Nov 28 16:46:34 compute-0 sshd-session[31066]: Connection closed by authenticating user root 188.166.104.67 port 47950 [preauth]
Nov 28 16:47:05 compute-0 sshd-session[31068]: Connection closed by authenticating user root 188.166.104.67 port 59628 [preauth]
Nov 28 16:47:37 compute-0 sshd-session[31070]: Connection closed by authenticating user root 188.166.104.67 port 41914 [preauth]
Nov 28 16:48:07 compute-0 sshd-session[31072]: Connection closed by authenticating user root 188.166.104.67 port 44564 [preauth]
Nov 28 16:48:37 compute-0 sshd-session[31074]: Connection closed by authenticating user root 188.166.104.67 port 37738 [preauth]
Nov 28 16:49:06 compute-0 sshd-session[31076]: Connection closed by authenticating user root 188.166.104.67 port 35288 [preauth]
Nov 28 16:49:35 compute-0 sshd-session[31078]: Connection closed by authenticating user root 188.166.104.67 port 41142 [preauth]
Nov 28 16:50:03 compute-0 sshd-session[31080]: Connection closed by authenticating user root 188.166.104.67 port 56378 [preauth]
Nov 28 16:50:22 compute-0 sshd-session[30036]: Received disconnect from 38.129.56.243 port 33416:11: disconnected by user
Nov 28 16:50:22 compute-0 sshd-session[30036]: Disconnected from user zuul 38.129.56.243 port 33416
Nov 28 16:50:22 compute-0 sshd-session[30033]: pam_unix(sshd:session): session closed for user zuul
Nov 28 16:50:22 compute-0 systemd[1]: session-7.scope: Deactivated successfully.
Nov 28 16:50:22 compute-0 systemd[1]: session-7.scope: Consumed 4.918s CPU time.
Nov 28 16:50:22 compute-0 systemd-logind[788]: Session 7 logged out. Waiting for processes to exit.
Nov 28 16:50:22 compute-0 systemd-logind[788]: Removed session 7.
Nov 28 16:50:32 compute-0 sshd-session[31083]: Connection closed by authenticating user root 188.166.104.67 port 46490 [preauth]
Nov 28 16:51:00 compute-0 sshd-session[31086]: Connection closed by authenticating user root 188.166.104.67 port 51396 [preauth]
Nov 28 16:51:27 compute-0 sshd-session[31089]: Connection closed by authenticating user root 188.166.104.67 port 38542 [preauth]
Nov 28 16:51:54 compute-0 sshd-session[31091]: Connection closed by authenticating user root 188.166.104.67 port 36940 [preauth]
Nov 28 16:52:22 compute-0 sshd-session[31093]: Connection closed by authenticating user root 188.166.104.67 port 58188 [preauth]
Nov 28 16:52:50 compute-0 sshd-session[31095]: Connection closed by authenticating user root 188.166.104.67 port 39456 [preauth]
Nov 28 16:53:18 compute-0 sshd-session[31097]: Connection closed by authenticating user root 188.166.104.67 port 60056 [preauth]
Nov 28 16:53:47 compute-0 sshd-session[31099]: Connection closed by authenticating user root 188.166.104.67 port 44596 [preauth]
Nov 28 16:54:17 compute-0 sshd-session[31101]: Connection closed by authenticating user root 188.166.104.67 port 49984 [preauth]
Nov 28 16:54:45 compute-0 sshd-session[31103]: Connection closed by authenticating user root 188.166.104.67 port 50530 [preauth]
Nov 28 16:55:13 compute-0 sshd-session[31105]: Connection closed by authenticating user root 188.166.104.67 port 44872 [preauth]
Nov 28 16:55:41 compute-0 sshd-session[31107]: Connection closed by authenticating user root 188.166.104.67 port 46944 [preauth]
Nov 28 16:56:08 compute-0 sshd-session[31109]: Connection closed by authenticating user root 188.166.104.67 port 56958 [preauth]
Nov 28 16:56:36 compute-0 sshd-session[31111]: Connection closed by authenticating user root 188.166.104.67 port 42258 [preauth]
Nov 28 16:57:03 compute-0 sshd-session[31113]: Connection closed by authenticating user root 188.166.104.67 port 38504 [preauth]
Nov 28 16:57:29 compute-0 sshd-session[31115]: Invalid user admin from 188.166.104.67 port 43776
Nov 28 16:57:30 compute-0 sshd-session[31115]: Connection closed by invalid user admin 188.166.104.67 port 43776 [preauth]
Nov 28 16:57:57 compute-0 sshd-session[31118]: Invalid user admin from 188.166.104.67 port 40026
Nov 28 16:57:57 compute-0 sshd-session[31118]: Connection closed by invalid user admin 188.166.104.67 port 40026 [preauth]
Nov 28 16:58:24 compute-0 sshd-session[31121]: Invalid user admin from 188.166.104.67 port 52468
Nov 28 16:58:24 compute-0 sshd-session[31121]: Connection closed by invalid user admin 188.166.104.67 port 52468 [preauth]
Nov 28 16:58:51 compute-0 sshd-session[31123]: Invalid user admin from 188.166.104.67 port 36732
Nov 28 16:58:51 compute-0 sshd-session[31123]: Connection closed by invalid user admin 188.166.104.67 port 36732 [preauth]
Nov 28 16:59:18 compute-0 sshd-session[31125]: Invalid user admin from 188.166.104.67 port 32960
Nov 28 16:59:19 compute-0 sshd-session[31125]: Connection closed by invalid user admin 188.166.104.67 port 32960 [preauth]
Nov 28 16:59:46 compute-0 sshd-session[31127]: Invalid user admin from 188.166.104.67 port 34832
Nov 28 16:59:46 compute-0 sshd-session[31127]: Connection closed by invalid user admin 188.166.104.67 port 34832 [preauth]
Nov 28 17:00:13 compute-0 sshd-session[31129]: Invalid user admin from 188.166.104.67 port 48220
Nov 28 17:00:13 compute-0 sshd-session[31129]: Connection closed by invalid user admin 188.166.104.67 port 48220 [preauth]
Nov 28 17:00:42 compute-0 sshd-session[31131]: Invalid user admin from 188.166.104.67 port 59830
Nov 28 17:00:42 compute-0 sshd-session[31131]: Connection closed by invalid user admin 188.166.104.67 port 59830 [preauth]
Nov 28 17:01:01 compute-0 CROND[31134]: (root) CMD (run-parts /etc/cron.hourly)
Nov 28 17:01:01 compute-0 run-parts[31137]: (/etc/cron.hourly) starting 0anacron
Nov 28 17:01:01 compute-0 anacron[31145]: Anacron started on 2025-11-28
Nov 28 17:01:01 compute-0 anacron[31145]: Will run job `cron.daily' in 26 min.
Nov 28 17:01:01 compute-0 anacron[31145]: Will run job `cron.weekly' in 46 min.
Nov 28 17:01:01 compute-0 anacron[31145]: Will run job `cron.monthly' in 66 min.
Nov 28 17:01:01 compute-0 anacron[31145]: Jobs will be executed sequentially
Nov 28 17:01:01 compute-0 run-parts[31147]: (/etc/cron.hourly) finished 0anacron
Nov 28 17:01:01 compute-0 CROND[31133]: (root) CMDEND (run-parts /etc/cron.hourly)
Nov 28 17:01:06 compute-0 sshd-session[31148]: Accepted publickey for zuul from 192.168.122.30 port 55586 ssh2: ECDSA SHA256:WZF57Px3euv6N78lWIGFzFXKuQ6jzdjGOx/DgOfNYD8
Nov 28 17:01:06 compute-0 systemd-logind[788]: New session 8 of user zuul.
Nov 28 17:01:06 compute-0 systemd[1]: Started Session 8 of User zuul.
Nov 28 17:01:06 compute-0 sshd-session[31148]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 28 17:01:07 compute-0 python3.9[31301]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 17:01:09 compute-0 sudo[31482]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmmxqxvkacawjpzpihpkkuwvkzoxdzih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349268.5490756-49-122078137646355/AnsiballZ_command.py'
Nov 28 17:01:09 compute-0 sudo[31482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:01:09 compute-0 python3.9[31484]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail
                                            pushd /var/tmp
                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                            pushd repo-setup-main
                                            python3 -m venv ./venv
                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./
                                            ./venv/bin/repo-setup current-podified -b antelope
                                            popd
                                            rm -rf repo-setup-main
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 17:01:09 compute-0 sshd-session[31431]: Invalid user admin from 188.166.104.67 port 48388
Nov 28 17:01:09 compute-0 sshd-session[31431]: Connection closed by invalid user admin 188.166.104.67 port 48388 [preauth]
Nov 28 17:01:17 compute-0 sudo[31482]: pam_unix(sudo:session): session closed for user root
Nov 28 17:01:17 compute-0 sshd-session[31151]: Connection closed by 192.168.122.30 port 55586
Nov 28 17:01:17 compute-0 sshd-session[31148]: pam_unix(sshd:session): session closed for user zuul
Nov 28 17:01:17 compute-0 systemd[1]: session-8.scope: Deactivated successfully.
Nov 28 17:01:17 compute-0 systemd[1]: session-8.scope: Consumed 8.347s CPU time.
Nov 28 17:01:17 compute-0 systemd-logind[788]: Session 8 logged out. Waiting for processes to exit.
Nov 28 17:01:17 compute-0 systemd-logind[788]: Removed session 8.
Nov 28 17:01:23 compute-0 sshd-session[31541]: Accepted publickey for zuul from 192.168.122.30 port 35972 ssh2: ECDSA SHA256:WZF57Px3euv6N78lWIGFzFXKuQ6jzdjGOx/DgOfNYD8
Nov 28 17:01:23 compute-0 systemd-logind[788]: New session 9 of user zuul.
Nov 28 17:01:23 compute-0 systemd[1]: Started Session 9 of User zuul.
Nov 28 17:01:23 compute-0 sshd-session[31541]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 28 17:01:24 compute-0 python3.9[31694]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 17:01:25 compute-0 sshd-session[31544]: Connection closed by 192.168.122.30 port 35972
Nov 28 17:01:25 compute-0 sshd-session[31541]: pam_unix(sshd:session): session closed for user zuul
Nov 28 17:01:25 compute-0 systemd[1]: session-9.scope: Deactivated successfully.
Nov 28 17:01:25 compute-0 systemd-logind[788]: Session 9 logged out. Waiting for processes to exit.
Nov 28 17:01:25 compute-0 systemd-logind[788]: Removed session 9.
Nov 28 17:01:36 compute-0 sshd-session[31722]: Invalid user admin from 188.166.104.67 port 37794
Nov 28 17:01:36 compute-0 sshd-session[31722]: Connection closed by invalid user admin 188.166.104.67 port 37794 [preauth]
Nov 28 17:01:43 compute-0 sshd-session[31724]: Accepted publickey for zuul from 192.168.122.30 port 52854 ssh2: ECDSA SHA256:WZF57Px3euv6N78lWIGFzFXKuQ6jzdjGOx/DgOfNYD8
Nov 28 17:01:43 compute-0 systemd-logind[788]: New session 10 of user zuul.
Nov 28 17:01:43 compute-0 systemd[1]: Started Session 10 of User zuul.
Nov 28 17:01:43 compute-0 sshd-session[31724]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 28 17:01:43 compute-0 python3.9[31877]: ansible-ansible.legacy.ping Invoked with data=pong
Nov 28 17:01:45 compute-0 python3.9[32051]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 17:01:46 compute-0 sudo[32201]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-slqcopyadlnhxkqhitaxahbykxewgpbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349305.6545994-74-80250581593102/AnsiballZ_command.py'
Nov 28 17:01:46 compute-0 sudo[32201]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:01:46 compute-0 python3.9[32203]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 17:01:46 compute-0 sudo[32201]: pam_unix(sudo:session): session closed for user root
Nov 28 17:01:47 compute-0 sudo[32354]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwxnwklsxlwjnnqmodbyzbjrlupnwvqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349306.9689934-98-122644727153500/AnsiballZ_stat.py'
Nov 28 17:01:47 compute-0 sudo[32354]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:01:47 compute-0 python3.9[32356]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 17:01:47 compute-0 sudo[32354]: pam_unix(sudo:session): session closed for user root
Nov 28 17:01:48 compute-0 sudo[32506]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ycolcceddlizlipfipqjonogfcwvatwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349307.9806318-114-241661233723512/AnsiballZ_file.py'
Nov 28 17:01:48 compute-0 sudo[32506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:01:48 compute-0 python3.9[32508]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:01:48 compute-0 sudo[32506]: pam_unix(sudo:session): session closed for user root
Nov 28 17:01:49 compute-0 sudo[32658]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chrheombxwzceqqlfplpzludletiurha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349308.7951386-130-95522844941588/AnsiballZ_stat.py'
Nov 28 17:01:49 compute-0 sudo[32658]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:01:49 compute-0 python3.9[32660]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:01:49 compute-0 sudo[32658]: pam_unix(sudo:session): session closed for user root
Nov 28 17:01:49 compute-0 sudo[32781]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dsapshtncajdtorfbfczakbuflzjsapi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349308.7951386-130-95522844941588/AnsiballZ_copy.py'
Nov 28 17:01:49 compute-0 sudo[32781]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:01:50 compute-0 python3.9[32783]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1764349308.7951386-130-95522844941588/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:01:50 compute-0 sudo[32781]: pam_unix(sudo:session): session closed for user root
Nov 28 17:01:50 compute-0 sudo[32933]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-maforzrfzsijhclkzhxtkyzbdbfnbfbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349310.3362067-160-76836896660265/AnsiballZ_setup.py'
Nov 28 17:01:50 compute-0 sudo[32933]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:01:51 compute-0 python3.9[32935]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 17:01:51 compute-0 sudo[32933]: pam_unix(sudo:session): session closed for user root
Nov 28 17:01:51 compute-0 sudo[33089]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghahhurfawonnvgavavrgvdqfotgepqq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349311.4259124-176-128186880818814/AnsiballZ_file.py'
Nov 28 17:01:51 compute-0 sudo[33089]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:01:51 compute-0 python3.9[33091]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:01:51 compute-0 sudo[33089]: pam_unix(sudo:session): session closed for user root
Nov 28 17:01:52 compute-0 sudo[33241]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jpzerxswcglbscjflzvmnpxnwufszfht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349312.1567008-194-13918978194491/AnsiballZ_file.py'
Nov 28 17:01:52 compute-0 sudo[33241]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:01:52 compute-0 python3.9[33243]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:01:52 compute-0 sudo[33241]: pam_unix(sudo:session): session closed for user root
Nov 28 17:01:53 compute-0 python3.9[33393]: ansible-ansible.builtin.service_facts Invoked
Nov 28 17:01:57 compute-0 python3.9[33646]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:01:58 compute-0 python3.9[33796]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 17:01:59 compute-0 python3.9[33950]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 17:02:00 compute-0 sudo[34106]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-broohqgsrldzgbydtwkxlsexvkwyoziu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349320.4081678-290-242602863156204/AnsiballZ_setup.py'
Nov 28 17:02:00 compute-0 sudo[34106]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:02:01 compute-0 python3.9[34108]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 17:02:01 compute-0 sudo[34106]: pam_unix(sudo:session): session closed for user root
Nov 28 17:02:01 compute-0 sudo[34191]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-trklqqnlxqjxtdlivuuwxnoedbttxykq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349320.4081678-290-242602863156204/AnsiballZ_dnf.py'
Nov 28 17:02:01 compute-0 sudo[34191]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:02:01 compute-0 python3.9[34193]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 17:02:03 compute-0 sshd-session[34222]: Invalid user admin from 188.166.104.67 port 40032
Nov 28 17:02:03 compute-0 sshd-session[34222]: Connection closed by invalid user admin 188.166.104.67 port 40032 [preauth]
Nov 28 17:02:30 compute-0 sshd-session[34318]: Invalid user admin from 188.166.104.67 port 57506
Nov 28 17:02:30 compute-0 sshd-session[34318]: Connection closed by invalid user admin 188.166.104.67 port 57506 [preauth]
Nov 28 17:02:52 compute-0 systemd[1]: Reloading.
Nov 28 17:02:52 compute-0 systemd-rc-local-generator[34394]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 17:02:52 compute-0 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Nov 28 17:02:52 compute-0 systemd[1]: Reloading.
Nov 28 17:02:52 compute-0 systemd-rc-local-generator[34432]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 17:02:52 compute-0 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Nov 28 17:02:52 compute-0 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Nov 28 17:02:52 compute-0 systemd[1]: Reloading.
Nov 28 17:02:52 compute-0 systemd-rc-local-generator[34472]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 17:02:53 compute-0 systemd[1]: Listening on LVM2 poll daemon socket.
Nov 28 17:02:53 compute-0 dbus-broker-launch[758]: Noticed file-system modification, trigger reload.
Nov 28 17:02:53 compute-0 dbus-broker-launch[758]: Noticed file-system modification, trigger reload.
Nov 28 17:02:53 compute-0 dbus-broker-launch[758]: Noticed file-system modification, trigger reload.
Nov 28 17:02:56 compute-0 sshd-session[34502]: Invalid user admin from 188.166.104.67 port 41200
Nov 28 17:02:56 compute-0 sshd-session[34502]: Connection closed by invalid user admin 188.166.104.67 port 41200 [preauth]
Nov 28 17:03:22 compute-0 sshd-session[34599]: Invalid user admin from 188.166.104.67 port 53826
Nov 28 17:03:22 compute-0 sshd-session[34599]: Connection closed by invalid user admin 188.166.104.67 port 53826 [preauth]
Nov 28 17:03:48 compute-0 sshd-session[34702]: Invalid user admin from 188.166.104.67 port 34830
Nov 28 17:03:48 compute-0 sshd-session[34702]: Connection closed by invalid user admin 188.166.104.67 port 34830 [preauth]
Nov 28 17:03:58 compute-0 kernel: SELinux:  Converting 2719 SID table entries...
Nov 28 17:03:58 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Nov 28 17:03:58 compute-0 kernel: SELinux:  policy capability open_perms=1
Nov 28 17:03:58 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Nov 28 17:03:58 compute-0 kernel: SELinux:  policy capability always_check_network=0
Nov 28 17:03:58 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 28 17:03:58 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 28 17:03:58 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 28 17:03:58 compute-0 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Nov 28 17:03:58 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 28 17:03:58 compute-0 systemd[1]: Starting man-db-cache-update.service...
Nov 28 17:03:58 compute-0 systemd[1]: Reloading.
Nov 28 17:03:58 compute-0 systemd-rc-local-generator[34820]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 17:03:59 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 28 17:03:59 compute-0 sudo[34191]: pam_unix(sudo:session): session closed for user root
Nov 28 17:03:59 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 28 17:03:59 compute-0 systemd[1]: Finished man-db-cache-update.service.
Nov 28 17:03:59 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1.207s CPU time.
Nov 28 17:03:59 compute-0 systemd[1]: run-rf82ab6b2831c40cd9aadd95ad1e6033d.service: Deactivated successfully.
Nov 28 17:04:07 compute-0 sudo[35730]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avafvxavgyavnukbjoooblnerajtctih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349447.6019936-314-120196346459357/AnsiballZ_command.py'
Nov 28 17:04:07 compute-0 sudo[35730]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:04:08 compute-0 python3.9[35732]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 17:04:09 compute-0 sudo[35730]: pam_unix(sudo:session): session closed for user root
Nov 28 17:04:10 compute-0 sudo[36011]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cpmnohilvoqzowtxswfadrnsnjsfuflo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349449.42792-330-24167211578114/AnsiballZ_selinux.py'
Nov 28 17:04:10 compute-0 sudo[36011]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:04:10 compute-0 python3.9[36013]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Nov 28 17:04:10 compute-0 sudo[36011]: pam_unix(sudo:session): session closed for user root
Nov 28 17:04:11 compute-0 sudo[36163]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kxgzublvotwnymkjrznoefxpjmjsvmyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349450.7311337-352-92427634603204/AnsiballZ_command.py'
Nov 28 17:04:11 compute-0 sudo[36163]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:04:11 compute-0 python3.9[36165]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Nov 28 17:04:13 compute-0 sudo[36163]: pam_unix(sudo:session): session closed for user root
Nov 28 17:04:14 compute-0 sudo[36316]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nsuvgovinkwkmgauhxdkoktjcedniywv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349453.7288258-368-118018839375337/AnsiballZ_file.py'
Nov 28 17:04:14 compute-0 sudo[36316]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:04:14 compute-0 python3.9[36318]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:04:14 compute-0 sudo[36316]: pam_unix(sudo:session): session closed for user root
Nov 28 17:04:15 compute-0 sshd-session[36343]: Invalid user admin from 188.166.104.67 port 34114
Nov 28 17:04:15 compute-0 sshd-session[36343]: Connection closed by invalid user admin 188.166.104.67 port 34114 [preauth]
Nov 28 17:04:15 compute-0 sudo[36470]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thidhkbzxqssnxyidrbkrsumtuhtnzro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349454.994557-384-275052714718239/AnsiballZ_mount.py'
Nov 28 17:04:15 compute-0 sudo[36470]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:04:18 compute-0 python3.9[36472]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Nov 28 17:04:18 compute-0 sudo[36470]: pam_unix(sudo:session): session closed for user root
Nov 28 17:04:19 compute-0 sudo[36623]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bgsyucyrrigcljhefhvgtoxramakvigw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349459.3957338-440-129075777668447/AnsiballZ_file.py'
Nov 28 17:04:19 compute-0 sudo[36623]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:04:20 compute-0 python3.9[36625]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:04:20 compute-0 sudo[36623]: pam_unix(sudo:session): session closed for user root
Nov 28 17:04:22 compute-0 sudo[36775]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygvgwlvczrommiwyqsxkpjyrfuotgioz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349462.093502-456-45782115323758/AnsiballZ_stat.py'
Nov 28 17:04:22 compute-0 sudo[36775]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:04:24 compute-0 python3.9[36777]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:04:24 compute-0 sudo[36775]: pam_unix(sudo:session): session closed for user root
Nov 28 17:04:25 compute-0 sudo[36898]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kltmnvqxszrkmovlsydntzudqkafaxgy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349462.093502-456-45782115323758/AnsiballZ_copy.py'
Nov 28 17:04:25 compute-0 sudo[36898]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:04:25 compute-0 python3.9[36900]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764349462.093502-456-45782115323758/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=b43badc3cbdbb105df5fde112b52524c7b9f08f8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:04:25 compute-0 sudo[36898]: pam_unix(sudo:session): session closed for user root
Nov 28 17:04:30 compute-0 sudo[37050]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-deimrroiboeiyjvifhatynkyiccetols ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349470.1951215-504-257942025025835/AnsiballZ_stat.py'
Nov 28 17:04:30 compute-0 sudo[37050]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:04:30 compute-0 python3.9[37052]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 17:04:30 compute-0 sudo[37050]: pam_unix(sudo:session): session closed for user root
Nov 28 17:04:31 compute-0 sudo[37202]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-astsoriyxjabxhnmirvyvtamovlleevn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349470.8610072-520-210766519107099/AnsiballZ_command.py'
Nov 28 17:04:31 compute-0 sudo[37202]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:04:31 compute-0 python3.9[37204]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 17:04:31 compute-0 sudo[37202]: pam_unix(sudo:session): session closed for user root
Nov 28 17:04:32 compute-0 sudo[37355]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-doxnypmhjktqyifmgrjrgpzkqgcqfaxo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349472.036844-536-154568977361311/AnsiballZ_file.py'
Nov 28 17:04:32 compute-0 sudo[37355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:04:32 compute-0 python3.9[37357]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:04:32 compute-0 sudo[37355]: pam_unix(sudo:session): session closed for user root
Nov 28 17:04:33 compute-0 sudo[37507]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rirfurkdwqlwegofptywikosgmmezwvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349473.187139-558-127174136599632/AnsiballZ_getent.py'
Nov 28 17:04:33 compute-0 sudo[37507]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:04:33 compute-0 python3.9[37509]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Nov 28 17:04:33 compute-0 sudo[37507]: pam_unix(sudo:session): session closed for user root
Nov 28 17:04:33 compute-0 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 17:04:33 compute-0 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 17:04:34 compute-0 sudo[37661]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abdefrlpplxpnudcubttqldrfvsowwbr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349473.9928334-574-232779874704514/AnsiballZ_group.py'
Nov 28 17:04:34 compute-0 sudo[37661]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:04:34 compute-0 python3.9[37663]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 28 17:04:34 compute-0 groupadd[37664]: group added to /etc/group: name=qemu, GID=107
Nov 28 17:04:34 compute-0 groupadd[37664]: group added to /etc/gshadow: name=qemu
Nov 28 17:04:34 compute-0 groupadd[37664]: new group: name=qemu, GID=107
Nov 28 17:04:34 compute-0 sudo[37661]: pam_unix(sudo:session): session closed for user root
Nov 28 17:04:35 compute-0 sudo[37819]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jkcfovoybzskgvjgkugtqmkmcryuxqbq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349474.9138815-590-223045457420594/AnsiballZ_user.py'
Nov 28 17:04:35 compute-0 sudo[37819]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:04:35 compute-0 python3.9[37821]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 28 17:04:35 compute-0 useradd[37823]: new user: name=qemu, UID=107, GID=107, home=/home/qemu, shell=/sbin/nologin, from=/dev/pts/0
Nov 28 17:04:35 compute-0 sudo[37819]: pam_unix(sudo:session): session closed for user root
Nov 28 17:04:36 compute-0 sudo[37979]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjefdnxljudbuqeleaklqstygckghduz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349475.9916773-606-48328165240515/AnsiballZ_getent.py'
Nov 28 17:04:36 compute-0 sudo[37979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:04:36 compute-0 python3.9[37981]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Nov 28 17:04:36 compute-0 sudo[37979]: pam_unix(sudo:session): session closed for user root
Nov 28 17:04:37 compute-0 sudo[38132]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-buzitflrbopaiirmboompoljronjmels ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349476.8245723-622-163846417479955/AnsiballZ_group.py'
Nov 28 17:04:37 compute-0 sudo[38132]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:04:37 compute-0 python3.9[38134]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 28 17:04:37 compute-0 groupadd[38135]: group added to /etc/group: name=hugetlbfs, GID=42477
Nov 28 17:04:37 compute-0 groupadd[38135]: group added to /etc/gshadow: name=hugetlbfs
Nov 28 17:04:37 compute-0 groupadd[38135]: new group: name=hugetlbfs, GID=42477
Nov 28 17:04:37 compute-0 sudo[38132]: pam_unix(sudo:session): session closed for user root
Nov 28 17:04:38 compute-0 sudo[38290]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pbyizmdgualqvyuuuturgwldekyhupbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349477.8470654-640-116998675147293/AnsiballZ_file.py'
Nov 28 17:04:38 compute-0 sudo[38290]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:04:38 compute-0 python3.9[38292]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Nov 28 17:04:38 compute-0 sudo[38290]: pam_unix(sudo:session): session closed for user root
Nov 28 17:04:38 compute-0 sudo[38442]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eciazijhsjggmyievqgboeoishadseyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349478.7041714-662-45867392608729/AnsiballZ_dnf.py'
Nov 28 17:04:38 compute-0 sudo[38442]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:04:39 compute-0 python3.9[38444]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 17:04:41 compute-0 sshd-session[38446]: Invalid user admin from 188.166.104.67 port 58218
Nov 28 17:04:41 compute-0 sudo[38442]: pam_unix(sudo:session): session closed for user root
Nov 28 17:04:41 compute-0 sshd-session[38446]: Connection closed by invalid user admin 188.166.104.67 port 58218 [preauth]
Nov 28 17:04:42 compute-0 sudo[38597]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wvpvzpqbxhvmcxopxxtozshgdbaxsxcd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349482.0337691-678-262046228421786/AnsiballZ_file.py'
Nov 28 17:04:42 compute-0 sudo[38597]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:04:42 compute-0 python3.9[38599]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:04:42 compute-0 sudo[38597]: pam_unix(sudo:session): session closed for user root
Nov 28 17:04:43 compute-0 sudo[38749]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nczskmzhyedfehenzjxivabksjazqttu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349482.733912-694-118147081010992/AnsiballZ_stat.py'
Nov 28 17:04:43 compute-0 sudo[38749]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:04:43 compute-0 python3.9[38751]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:04:43 compute-0 sudo[38749]: pam_unix(sudo:session): session closed for user root
Nov 28 17:04:43 compute-0 sudo[38872]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmlhdtepugpfdorjwknwsfwtblnpmrzn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349482.733912-694-118147081010992/AnsiballZ_copy.py'
Nov 28 17:04:43 compute-0 sudo[38872]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:04:43 compute-0 python3.9[38874]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764349482.733912-694-118147081010992/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:04:43 compute-0 sudo[38872]: pam_unix(sudo:session): session closed for user root
Nov 28 17:04:44 compute-0 sudo[39024]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjgatumxhbyinrrnpldtgbufylyscvvo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349483.9778605-724-268942308825188/AnsiballZ_systemd.py'
Nov 28 17:04:44 compute-0 sudo[39024]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:04:44 compute-0 python3.9[39026]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 17:04:45 compute-0 systemd[1]: Starting Load Kernel Modules...
Nov 28 17:04:45 compute-0 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Nov 28 17:04:45 compute-0 kernel: Bridge firewalling registered
Nov 28 17:04:45 compute-0 systemd-modules-load[39030]: Inserted module 'br_netfilter'
Nov 28 17:04:45 compute-0 systemd[1]: Finished Load Kernel Modules.
Nov 28 17:04:45 compute-0 sudo[39024]: pam_unix(sudo:session): session closed for user root
Nov 28 17:04:45 compute-0 sudo[39183]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dpsvidjzwvsafkemcfqfxoickvxjsxsp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349485.3415208-740-270525236706160/AnsiballZ_stat.py'
Nov 28 17:04:45 compute-0 sudo[39183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:04:45 compute-0 python3.9[39185]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:04:45 compute-0 sudo[39183]: pam_unix(sudo:session): session closed for user root
Nov 28 17:04:46 compute-0 sudo[39306]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vlmwymnggwgizkppgcwbdxavabvovcck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349485.3415208-740-270525236706160/AnsiballZ_copy.py'
Nov 28 17:04:46 compute-0 sudo[39306]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:04:46 compute-0 python3.9[39308]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764349485.3415208-740-270525236706160/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:04:46 compute-0 sudo[39306]: pam_unix(sudo:session): session closed for user root
Nov 28 17:04:47 compute-0 sudo[39458]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tzfadvkercodwlqpvktmmommhjqhylzj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349487.3046458-776-163208943149033/AnsiballZ_dnf.py'
Nov 28 17:04:47 compute-0 sudo[39458]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:04:47 compute-0 python3.9[39460]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 17:04:52 compute-0 dbus-broker-launch[758]: Noticed file-system modification, trigger reload.
Nov 28 17:04:52 compute-0 dbus-broker-launch[758]: Noticed file-system modification, trigger reload.
Nov 28 17:04:52 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 28 17:04:52 compute-0 systemd[1]: Starting man-db-cache-update.service...
Nov 28 17:04:52 compute-0 systemd[1]: Reloading.
Nov 28 17:04:52 compute-0 systemd-rc-local-generator[39516]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 17:04:52 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 28 17:04:53 compute-0 sudo[39458]: pam_unix(sudo:session): session closed for user root
Nov 28 17:04:54 compute-0 irqbalance[780]: Cannot change IRQ 26 affinity: Operation not permitted
Nov 28 17:04:54 compute-0 irqbalance[780]: IRQ 26 affinity is now unmanaged
Nov 28 17:04:54 compute-0 irqbalance[780]: Cannot change IRQ 27 affinity: Operation not permitted
Nov 28 17:04:54 compute-0 irqbalance[780]: IRQ 27 affinity is now unmanaged
Nov 28 17:04:54 compute-0 python3.9[41200]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 17:04:55 compute-0 python3.9[42525]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Nov 28 17:04:55 compute-0 python3.9[43384]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 17:04:56 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 28 17:04:56 compute-0 systemd[1]: Finished man-db-cache-update.service.
Nov 28 17:04:56 compute-0 systemd[1]: man-db-cache-update.service: Consumed 4.414s CPU time.
Nov 28 17:04:56 compute-0 systemd[1]: run-re753629c69514f94b2117ea943f7faa0.service: Deactivated successfully.
Nov 28 17:04:57 compute-0 sudo[43618]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qtfjiplxnxkdceqqhtbnnynaxmzmuulw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349496.936235-854-230129802937355/AnsiballZ_command.py'
Nov 28 17:04:57 compute-0 sudo[43618]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:04:57 compute-0 python3.9[43620]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 17:04:57 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Nov 28 17:04:57 compute-0 systemd[1]: Starting Authorization Manager...
Nov 28 17:04:57 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Nov 28 17:04:57 compute-0 polkitd[43837]: Started polkitd version 0.117
Nov 28 17:04:57 compute-0 polkitd[43837]: Loading rules from directory /etc/polkit-1/rules.d
Nov 28 17:04:57 compute-0 polkitd[43837]: Loading rules from directory /usr/share/polkit-1/rules.d
Nov 28 17:04:57 compute-0 polkitd[43837]: Finished loading, compiling and executing 2 rules
Nov 28 17:04:57 compute-0 systemd[1]: Started Authorization Manager.
Nov 28 17:04:57 compute-0 polkitd[43837]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Nov 28 17:04:58 compute-0 sudo[43618]: pam_unix(sudo:session): session closed for user root
Nov 28 17:04:58 compute-0 sudo[44005]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iokowruckassxzyrfcdmcuhqyivsjrgp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349498.426206-872-253798074649701/AnsiballZ_systemd.py'
Nov 28 17:04:58 compute-0 sudo[44005]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:04:58 compute-0 python3.9[44007]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 17:04:59 compute-0 systemd[1]: Stopping Dynamic System Tuning Daemon...
Nov 28 17:04:59 compute-0 systemd[1]: tuned.service: Deactivated successfully.
Nov 28 17:04:59 compute-0 systemd[1]: Stopped Dynamic System Tuning Daemon.
Nov 28 17:04:59 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Nov 28 17:04:59 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Nov 28 17:04:59 compute-0 sudo[44005]: pam_unix(sudo:session): session closed for user root
Nov 28 17:05:00 compute-0 python3.9[44168]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Nov 28 17:05:03 compute-0 sudo[44318]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmrymhqdpgmvszecqsacbxftcsokmdec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349503.1088505-986-275125928423261/AnsiballZ_systemd.py'
Nov 28 17:05:03 compute-0 sudo[44318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:05:03 compute-0 python3.9[44320]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 17:05:03 compute-0 systemd[1]: Reloading.
Nov 28 17:05:03 compute-0 systemd-rc-local-generator[44349]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 17:05:04 compute-0 sudo[44318]: pam_unix(sudo:session): session closed for user root
Nov 28 17:05:04 compute-0 sudo[44506]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gpzqvnhhwnzjjmaltltumkguhhncfmfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349504.4197397-986-266488516799816/AnsiballZ_systemd.py'
Nov 28 17:05:04 compute-0 sudo[44506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:05:04 compute-0 python3.9[44508]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 17:05:05 compute-0 systemd[1]: Reloading.
Nov 28 17:05:05 compute-0 systemd-rc-local-generator[44537]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 17:05:05 compute-0 sudo[44506]: pam_unix(sudo:session): session closed for user root
Nov 28 17:05:06 compute-0 sudo[44695]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vcjjirspcvkqxonlaxxiftyymcqpzhis ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349505.867771-1018-171178216043930/AnsiballZ_command.py'
Nov 28 17:05:06 compute-0 sudo[44695]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:05:06 compute-0 python3.9[44697]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 17:05:06 compute-0 sudo[44695]: pam_unix(sudo:session): session closed for user root
Nov 28 17:05:06 compute-0 sudo[44848]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqwibwuwzfdhjmqmxiubkutchuzzovgu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349506.643438-1034-55678224129529/AnsiballZ_command.py'
Nov 28 17:05:06 compute-0 sudo[44848]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:05:07 compute-0 python3.9[44850]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 17:05:07 compute-0 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Nov 28 17:05:07 compute-0 sudo[44848]: pam_unix(sudo:session): session closed for user root
Nov 28 17:05:07 compute-0 sudo[45003]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kaemalgxkuwujftatehfjgkpmlqtaleg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349507.4022167-1050-54527633218888/AnsiballZ_command.py'
Nov 28 17:05:07 compute-0 sudo[45003]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:05:07 compute-0 sshd-session[44928]: Invalid user admin from 188.166.104.67 port 38048
Nov 28 17:05:07 compute-0 python3.9[45005]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 17:05:07 compute-0 sshd-session[44928]: Connection closed by invalid user admin 188.166.104.67 port 38048 [preauth]
Nov 28 17:05:09 compute-0 sudo[45003]: pam_unix(sudo:session): session closed for user root
Nov 28 17:05:09 compute-0 sudo[45165]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hueahyooihfqxpcqawolanhialyhelzh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349509.642292-1066-151288479476339/AnsiballZ_command.py'
Nov 28 17:05:09 compute-0 sudo[45165]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:05:10 compute-0 python3.9[45167]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 17:05:10 compute-0 sudo[45165]: pam_unix(sudo:session): session closed for user root
Nov 28 17:05:10 compute-0 sudo[45318]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfzatfhdmfifmmnedgwebcawsbudjeth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349510.368728-1082-177974613509434/AnsiballZ_systemd.py'
Nov 28 17:05:10 compute-0 sudo[45318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:05:11 compute-0 python3.9[45320]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 17:05:11 compute-0 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Nov 28 17:05:11 compute-0 systemd[1]: Stopped Apply Kernel Variables.
Nov 28 17:05:11 compute-0 systemd[1]: Stopping Apply Kernel Variables...
Nov 28 17:05:11 compute-0 systemd[1]: Starting Apply Kernel Variables...
Nov 28 17:05:11 compute-0 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Nov 28 17:05:11 compute-0 systemd[1]: Finished Apply Kernel Variables.
Nov 28 17:05:11 compute-0 sudo[45318]: pam_unix(sudo:session): session closed for user root
Nov 28 17:05:12 compute-0 sshd-session[31727]: Connection closed by 192.168.122.30 port 52854
Nov 28 17:05:12 compute-0 sshd-session[31724]: pam_unix(sshd:session): session closed for user zuul
Nov 28 17:05:12 compute-0 systemd[1]: session-10.scope: Deactivated successfully.
Nov 28 17:05:12 compute-0 systemd[1]: session-10.scope: Consumed 2min 17.383s CPU time.
Nov 28 17:05:12 compute-0 systemd-logind[788]: Session 10 logged out. Waiting for processes to exit.
Nov 28 17:05:12 compute-0 systemd-logind[788]: Removed session 10.
Nov 28 17:05:18 compute-0 sshd-session[45350]: Accepted publickey for zuul from 192.168.122.30 port 34204 ssh2: ECDSA SHA256:WZF57Px3euv6N78lWIGFzFXKuQ6jzdjGOx/DgOfNYD8
Nov 28 17:05:18 compute-0 systemd-logind[788]: New session 11 of user zuul.
Nov 28 17:05:18 compute-0 systemd[1]: Started Session 11 of User zuul.
Nov 28 17:05:18 compute-0 sshd-session[45350]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 28 17:05:19 compute-0 python3.9[45503]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 17:05:21 compute-0 python3.9[45657]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 17:05:23 compute-0 sudo[45811]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uqylnbnxyovzlhptlbrwmkgudqbrktwp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349522.9720063-85-45837891547322/AnsiballZ_command.py'
Nov 28 17:05:23 compute-0 sudo[45811]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:05:23 compute-0 python3.9[45813]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 17:05:23 compute-0 sudo[45811]: pam_unix(sudo:session): session closed for user root
Nov 28 17:05:24 compute-0 python3.9[45964]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 17:05:25 compute-0 sudo[46118]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lrnrunqssoorkrullujobenloyzfnrzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349525.2979794-125-136973752724523/AnsiballZ_setup.py'
Nov 28 17:05:25 compute-0 sudo[46118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:05:25 compute-0 python3.9[46120]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 17:05:26 compute-0 sudo[46118]: pam_unix(sudo:session): session closed for user root
Nov 28 17:05:26 compute-0 sudo[46202]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gstpimjvemgqihdbsozwjrktaulnqlsj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349525.2979794-125-136973752724523/AnsiballZ_dnf.py'
Nov 28 17:05:26 compute-0 sudo[46202]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:05:26 compute-0 python3.9[46204]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 17:05:28 compute-0 sudo[46202]: pam_unix(sudo:session): session closed for user root
Nov 28 17:05:29 compute-0 sudo[46355]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wczjulhayryquddxikdksbbcpqujbefd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349528.991872-149-83900038387891/AnsiballZ_setup.py'
Nov 28 17:05:29 compute-0 sudo[46355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:05:29 compute-0 python3.9[46357]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 17:05:29 compute-0 sudo[46355]: pam_unix(sudo:session): session closed for user root
Nov 28 17:05:31 compute-0 sudo[46526]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tuadijdplfyvvdlimdenarkjdoicglst ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349530.5219264-171-31896520359505/AnsiballZ_file.py'
Nov 28 17:05:31 compute-0 sudo[46526]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:05:31 compute-0 python3.9[46528]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:05:31 compute-0 sudo[46526]: pam_unix(sudo:session): session closed for user root
Nov 28 17:05:31 compute-0 sudo[46678]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pbeydqqeqyhumqfyyughrrxkfzckeane ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349531.4320333-187-57993318455715/AnsiballZ_command.py'
Nov 28 17:05:31 compute-0 sudo[46678]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:05:31 compute-0 python3.9[46680]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 17:05:32 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat3931683669-merged.mount: Deactivated successfully.
Nov 28 17:05:32 compute-0 podman[46681]: 2025-11-28 17:05:32.032744797 +0000 UTC m=+0.070482325 system refresh
Nov 28 17:05:32 compute-0 sudo[46678]: pam_unix(sudo:session): session closed for user root
Nov 28 17:05:32 compute-0 sudo[46842]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txujqnpamtomletifsetawhxvwofuvrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349532.2369013-203-80605456785099/AnsiballZ_stat.py'
Nov 28 17:05:32 compute-0 sudo[46842]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:05:32 compute-0 python3.9[46844]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:05:32 compute-0 sudo[46842]: pam_unix(sudo:session): session closed for user root
Nov 28 17:05:33 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 17:05:33 compute-0 sudo[46965]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftfhlqdgbfkivaapjuwpzvnwuhejxzzs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349532.2369013-203-80605456785099/AnsiballZ_copy.py'
Nov 28 17:05:33 compute-0 sudo[46965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:05:33 compute-0 python3.9[46967]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764349532.2369013-203-80605456785099/.source.json follow=False _original_basename=podman_network_config.j2 checksum=6c81efb682fd38d4dc1a840bfdc0be6937c35d5b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:05:33 compute-0 sudo[46965]: pam_unix(sudo:session): session closed for user root
Nov 28 17:05:34 compute-0 sudo[47117]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzrfqyvixibqnbwdsphzsfuhabwxutwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349534.0936122-233-112124438877150/AnsiballZ_stat.py'
Nov 28 17:05:34 compute-0 sudo[47117]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:05:34 compute-0 python3.9[47119]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:05:34 compute-0 sudo[47117]: pam_unix(sudo:session): session closed for user root
Nov 28 17:05:34 compute-0 sudo[47242]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjcefhtzixwvgrenpnbupyyjquqvlcrf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349534.0936122-233-112124438877150/AnsiballZ_copy.py'
Nov 28 17:05:34 compute-0 sudo[47242]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:05:34 compute-0 sshd-session[47120]: Invalid user admin from 188.166.104.67 port 54756
Nov 28 17:05:35 compute-0 sshd-session[47120]: Connection closed by invalid user admin 188.166.104.67 port 54756 [preauth]
Nov 28 17:05:35 compute-0 python3.9[47244]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764349534.0936122-233-112124438877150/.source.conf follow=False _original_basename=registries.conf.j2 checksum=e054e42fc917865162376c34713b3d5516074d23 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:05:35 compute-0 sudo[47242]: pam_unix(sudo:session): session closed for user root
Nov 28 17:05:36 compute-0 sudo[47394]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmyrgqigucajdufzrjgjldjzqbedxnlj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349535.5736275-265-135797042021455/AnsiballZ_ini_file.py'
Nov 28 17:05:36 compute-0 sudo[47394]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:05:36 compute-0 python3.9[47396]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:05:36 compute-0 sudo[47394]: pam_unix(sudo:session): session closed for user root
Nov 28 17:05:37 compute-0 sudo[47546]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvszmnfqdmyuzjdprzcvcaslfkjbflnw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349536.6830354-265-254948614437156/AnsiballZ_ini_file.py'
Nov 28 17:05:37 compute-0 sudo[47546]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:05:37 compute-0 python3.9[47548]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:05:37 compute-0 sudo[47546]: pam_unix(sudo:session): session closed for user root
Nov 28 17:05:38 compute-0 sudo[47698]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnwbntxhkdypnvnznpdtbuofqlologkm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349537.6202922-265-49156228780218/AnsiballZ_ini_file.py'
Nov 28 17:05:38 compute-0 sudo[47698]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:05:38 compute-0 python3.9[47700]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:05:38 compute-0 sudo[47698]: pam_unix(sudo:session): session closed for user root
Nov 28 17:05:38 compute-0 sudo[47850]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dvmabqiwhunbfzwrauftxogertfbmzrr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349538.3948278-265-25374073397483/AnsiballZ_ini_file.py'
Nov 28 17:05:38 compute-0 sudo[47850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:05:38 compute-0 python3.9[47852]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:05:38 compute-0 sudo[47850]: pam_unix(sudo:session): session closed for user root
Nov 28 17:05:40 compute-0 python3.9[48002]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 17:05:41 compute-0 sudo[48154]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sntkjwpfzqxjyeeaszmbohejfmdaipuh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349540.5562484-345-125999727766870/AnsiballZ_dnf.py'
Nov 28 17:05:41 compute-0 sudo[48154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:05:41 compute-0 python3.9[48156]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 28 17:05:42 compute-0 sudo[48154]: pam_unix(sudo:session): session closed for user root
Nov 28 17:05:43 compute-0 sudo[48307]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lpxoijeepfcywupprefykiuiuqiwagxh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349543.2144103-361-210341562756776/AnsiballZ_dnf.py'
Nov 28 17:05:43 compute-0 sudo[48307]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:05:43 compute-0 python3.9[48309]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 28 17:05:45 compute-0 sudo[48307]: pam_unix(sudo:session): session closed for user root
Nov 28 17:05:46 compute-0 sudo[48467]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lsnjrphvlfskksvbvnchfwmgwoptvimn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349546.448471-381-34093763049270/AnsiballZ_dnf.py'
Nov 28 17:05:46 compute-0 sudo[48467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:05:46 compute-0 python3.9[48469]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 28 17:05:48 compute-0 sudo[48467]: pam_unix(sudo:session): session closed for user root
Nov 28 17:05:49 compute-0 sudo[48620]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-btcevehmmzlwuvqgqhmcvzsosuhphoyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349548.7577016-399-123646741743246/AnsiballZ_dnf.py'
Nov 28 17:05:49 compute-0 sudo[48620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:05:49 compute-0 python3.9[48622]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 28 17:05:50 compute-0 sudo[48620]: pam_unix(sudo:session): session closed for user root
Nov 28 17:05:51 compute-0 sudo[48773]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-duxfofioupdulxqfownngtecweufgwxg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349551.26336-421-132649000910875/AnsiballZ_dnf.py'
Nov 28 17:05:51 compute-0 sudo[48773]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:05:51 compute-0 python3.9[48775]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['NetworkManager-ovs'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 28 17:05:53 compute-0 sudo[48773]: pam_unix(sudo:session): session closed for user root
Nov 28 17:05:54 compute-0 sudo[48929]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzllbhpjxpbyiffdzgxmdbeglglevsok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349553.9826376-437-166590923575056/AnsiballZ_dnf.py'
Nov 28 17:05:54 compute-0 sudo[48929]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:05:54 compute-0 python3.9[48931]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 28 17:05:57 compute-0 sudo[48929]: pam_unix(sudo:session): session closed for user root
Nov 28 17:05:57 compute-0 sudo[49098]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uyjsqwtdltpkikrfsatpwhtokvnkbccq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349557.7033083-455-165836101773107/AnsiballZ_dnf.py'
Nov 28 17:05:57 compute-0 sudo[49098]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:05:58 compute-0 python3.9[49100]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 28 17:05:59 compute-0 sudo[49098]: pam_unix(sudo:session): session closed for user root
Nov 28 17:06:00 compute-0 sudo[49251]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avanqgtmgbqkfjldfqkgglzbvatoedcr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349560.0832305-473-203467318081584/AnsiballZ_dnf.py'
Nov 28 17:06:00 compute-0 sudo[49251]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:06:00 compute-0 python3.9[49253]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 28 17:06:01 compute-0 sshd-session[49255]: Invalid user admin from 188.166.104.67 port 36076
Nov 28 17:06:01 compute-0 sshd-session[49255]: Connection closed by invalid user admin 188.166.104.67 port 36076 [preauth]
Nov 28 17:06:14 compute-0 sudo[49251]: pam_unix(sudo:session): session closed for user root
Nov 28 17:06:15 compute-0 sudo[49590]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtywqpuyaqanklqikhikudaeskokedfb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349575.0212016-491-164809147061964/AnsiballZ_dnf.py'
Nov 28 17:06:15 compute-0 sudo[49590]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:06:15 compute-0 python3.9[49592]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['iscsi-initiator-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 28 17:06:17 compute-0 sudo[49590]: pam_unix(sudo:session): session closed for user root
Nov 28 17:06:19 compute-0 sudo[49746]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjhdrtbtumdrsoazphzelwxytprghbke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349577.9652433-513-13091836166445/AnsiballZ_file.py'
Nov 28 17:06:19 compute-0 sudo[49746]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:06:19 compute-0 python3.9[49748]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:06:19 compute-0 sudo[49746]: pam_unix(sudo:session): session closed for user root
Nov 28 17:06:20 compute-0 sudo[49921]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hkyakezwrhsdkkldgclqfmtnvttgfuhi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349579.7225378-529-278992333154349/AnsiballZ_stat.py'
Nov 28 17:06:20 compute-0 sudo[49921]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:06:20 compute-0 python3.9[49923]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:06:20 compute-0 sudo[49921]: pam_unix(sudo:session): session closed for user root
Nov 28 17:06:20 compute-0 sudo[50044]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrqweomzuvavxlzsrgrgshwokpsasyxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349579.7225378-529-278992333154349/AnsiballZ_copy.py'
Nov 28 17:06:20 compute-0 sudo[50044]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:06:20 compute-0 python3.9[50046]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1764349579.7225378-529-278992333154349/.source.json _original_basename=.hd2_fjba follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:06:20 compute-0 sudo[50044]: pam_unix(sudo:session): session closed for user root
Nov 28 17:06:21 compute-0 sudo[50196]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdgkvupiambopghbfknadhxkaazrqyqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349581.3217292-565-138828847042104/AnsiballZ_podman_image.py'
Nov 28 17:06:21 compute-0 sudo[50196]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:06:22 compute-0 python3.9[50198]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 28 17:06:22 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 17:06:24 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat2761884778-lower\x2dmapped.mount: Deactivated successfully.
Nov 28 17:06:28 compute-0 sshd-session[50290]: Invalid user ubuntu from 188.166.104.67 port 57022
Nov 28 17:06:29 compute-0 sshd-session[50290]: Connection closed by invalid user ubuntu 188.166.104.67 port 57022 [preauth]
Nov 28 17:06:30 compute-0 podman[50210]: 2025-11-28 17:06:30.236387549 +0000 UTC m=+8.127091879 image pull 52cb1910f3f090372807028d1c2aea98d2557b1086636469529f290368ecdf69 quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Nov 28 17:06:30 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 17:06:30 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 17:06:30 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 17:06:30 compute-0 sudo[50196]: pam_unix(sudo:session): session closed for user root
Nov 28 17:06:31 compute-0 sudo[50505]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bdexpfqusvrhvbdptypfwxcjubssuqib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349591.1736317-587-96807897162966/AnsiballZ_podman_image.py'
Nov 28 17:06:31 compute-0 sudo[50505]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:06:31 compute-0 python3.9[50507]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 28 17:06:31 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 17:06:47 compute-0 podman[50519]: 2025-11-28 17:06:47.046235352 +0000 UTC m=+15.354333639 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 17:06:47 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 17:06:47 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 17:06:47 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 17:06:47 compute-0 sudo[50505]: pam_unix(sudo:session): session closed for user root
Nov 28 17:06:50 compute-0 sudo[50816]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xgakgzpofqgslbnulweahfjsaggbmiml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349610.2062855-607-130853029142413/AnsiballZ_podman_image.py'
Nov 28 17:06:50 compute-0 sudo[50816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:06:50 compute-0 python3.9[50818]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 28 17:06:50 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 17:06:52 compute-0 podman[50830]: 2025-11-28 17:06:52.062852552 +0000 UTC m=+1.290284956 image pull f275b8d168f7f57f31e3da49224019f39f95c80a833f083696a964527b07b54f quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Nov 28 17:06:52 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 17:06:52 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 17:06:52 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 17:06:52 compute-0 sudo[50816]: pam_unix(sudo:session): session closed for user root
Nov 28 17:06:53 compute-0 sudo[51064]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pzwteblbidalnkpnwmihlelbxpxdlqxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349612.722969-625-69286401282074/AnsiballZ_podman_image.py'
Nov 28 17:06:53 compute-0 sudo[51064]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:06:53 compute-0 python3.9[51066]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 28 17:06:53 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 17:07:07 compute-0 podman[51078]: 2025-11-28 17:07:07.344291576 +0000 UTC m=+14.049120557 image pull b65793e7266422f5b94c32d109b906c8ffd974cf2ddf0b6929e463e29e05864a quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Nov 28 17:07:07 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 17:07:07 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 17:07:07 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 17:07:07 compute-0 sudo[51064]: pam_unix(sudo:session): session closed for user root
Nov 28 17:07:11 compute-0 sudo[51335]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwaunxtkozineafnubxkpalevdnvseyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349631.3849382-647-68304607559847/AnsiballZ_podman_image.py'
Nov 28 17:07:11 compute-0 sudo[51335]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:07:15 compute-0 python3.9[51337]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 28 17:07:15 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 17:07:19 compute-0 podman[51350]: 2025-11-28 17:07:19.550615975 +0000 UTC m=+3.557108101 image pull e6f07353639e492d8c9627d6d615ceeb47cb00ac4d14993b12e8023ee2aeee6f quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Nov 28 17:07:19 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 17:07:19 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 17:07:19 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 17:07:19 compute-0 sudo[51335]: pam_unix(sudo:session): session closed for user root
Nov 28 17:07:20 compute-0 sudo[51603]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rodklebzxmtvfvwzrowkwxjodwwpffph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349639.965423-647-97840620610585/AnsiballZ_podman_image.py'
Nov 28 17:07:20 compute-0 sudo[51603]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:07:20 compute-0 python3.9[51605]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter:v1.5.0 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 28 17:07:25 compute-0 podman[51616]: 2025-11-28 17:07:25.233165731 +0000 UTC m=+4.735354705 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Nov 28 17:07:25 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 17:07:25 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 17:07:25 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 17:07:25 compute-0 sudo[51603]: pam_unix(sudo:session): session closed for user root
Nov 28 17:07:26 compute-0 sshd-session[45353]: Connection closed by 192.168.122.30 port 34204
Nov 28 17:07:26 compute-0 sshd-session[45350]: pam_unix(sshd:session): session closed for user zuul
Nov 28 17:07:26 compute-0 systemd[1]: session-11.scope: Deactivated successfully.
Nov 28 17:07:26 compute-0 systemd[1]: session-11.scope: Consumed 1min 59.524s CPU time.
Nov 28 17:07:26 compute-0 systemd-logind[788]: Session 11 logged out. Waiting for processes to exit.
Nov 28 17:07:26 compute-0 systemd-logind[788]: Removed session 11.
Nov 28 17:07:31 compute-0 sshd-session[51763]: Accepted publickey for zuul from 192.168.122.30 port 60314 ssh2: ECDSA SHA256:WZF57Px3euv6N78lWIGFzFXKuQ6jzdjGOx/DgOfNYD8
Nov 28 17:07:31 compute-0 systemd-logind[788]: New session 12 of user zuul.
Nov 28 17:07:31 compute-0 systemd[1]: Started Session 12 of User zuul.
Nov 28 17:07:31 compute-0 sshd-session[51763]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 28 17:07:32 compute-0 python3.9[51916]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 17:07:33 compute-0 sudo[52070]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tgquhkntsizowauwhzkqdruhabnsaazg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349653.5810544-57-207237701373343/AnsiballZ_getent.py'
Nov 28 17:07:33 compute-0 sudo[52070]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:07:34 compute-0 python3.9[52072]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Nov 28 17:07:34 compute-0 sudo[52070]: pam_unix(sudo:session): session closed for user root
Nov 28 17:07:34 compute-0 sudo[52223]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oekmghpjiullrfbdqmxfsnqtoozjblqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349654.3753436-73-119443429765002/AnsiballZ_group.py'
Nov 28 17:07:34 compute-0 sudo[52223]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:07:34 compute-0 python3.9[52225]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 28 17:07:35 compute-0 groupadd[52226]: group added to /etc/group: name=openvswitch, GID=42476
Nov 28 17:07:35 compute-0 groupadd[52226]: group added to /etc/gshadow: name=openvswitch
Nov 28 17:07:35 compute-0 groupadd[52226]: new group: name=openvswitch, GID=42476
Nov 28 17:07:35 compute-0 sudo[52223]: pam_unix(sudo:session): session closed for user root
Nov 28 17:07:37 compute-0 sudo[52381]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-silpmoptqynnraopsvqepbvnjtdvawxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349657.0734935-89-139444682308274/AnsiballZ_user.py'
Nov 28 17:07:37 compute-0 sudo[52381]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:07:37 compute-0 python3.9[52383]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 28 17:07:38 compute-0 useradd[52385]: new user: name=openvswitch, UID=42476, GID=42476, home=/home/openvswitch, shell=/sbin/nologin, from=/dev/pts/0
Nov 28 17:07:38 compute-0 useradd[52385]: add 'openvswitch' to group 'hugetlbfs'
Nov 28 17:07:38 compute-0 useradd[52385]: add 'openvswitch' to shadow group 'hugetlbfs'
Nov 28 17:07:38 compute-0 sudo[52381]: pam_unix(sudo:session): session closed for user root
Nov 28 17:07:39 compute-0 sudo[52541]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gseieatwojshisbizedlbqkbnozrcmcs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349659.2259724-109-87622695535796/AnsiballZ_setup.py'
Nov 28 17:07:39 compute-0 sudo[52541]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:07:39 compute-0 python3.9[52543]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 17:07:40 compute-0 sudo[52541]: pam_unix(sudo:session): session closed for user root
Nov 28 17:07:40 compute-0 sudo[52625]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-awugwlcddooffljazjhotdxcxmgwmzjg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349659.2259724-109-87622695535796/AnsiballZ_dnf.py'
Nov 28 17:07:40 compute-0 sudo[52625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:07:40 compute-0 python3.9[52627]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 28 17:07:42 compute-0 sudo[52625]: pam_unix(sudo:session): session closed for user root
Nov 28 17:07:43 compute-0 sudo[52787]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdaptuorajhisprfgtlfaestahbvetoh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349663.5532358-137-652837560739/AnsiballZ_dnf.py'
Nov 28 17:07:43 compute-0 sudo[52787]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:07:44 compute-0 python3.9[52789]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 17:07:57 compute-0 kernel: SELinux:  Converting 2732 SID table entries...
Nov 28 17:07:58 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Nov 28 17:07:58 compute-0 kernel: SELinux:  policy capability open_perms=1
Nov 28 17:07:58 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Nov 28 17:07:58 compute-0 kernel: SELinux:  policy capability always_check_network=0
Nov 28 17:07:58 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 28 17:07:58 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 28 17:07:58 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 28 17:07:58 compute-0 groupadd[52812]: group added to /etc/group: name=unbound, GID=993
Nov 28 17:07:58 compute-0 groupadd[52812]: group added to /etc/gshadow: name=unbound
Nov 28 17:07:58 compute-0 groupadd[52812]: new group: name=unbound, GID=993
Nov 28 17:07:58 compute-0 useradd[52819]: new user: name=unbound, UID=993, GID=993, home=/var/lib/unbound, shell=/sbin/nologin, from=none
Nov 28 17:07:58 compute-0 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Nov 28 17:07:58 compute-0 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Nov 28 17:07:59 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 28 17:07:59 compute-0 systemd[1]: Starting man-db-cache-update.service...
Nov 28 17:07:59 compute-0 systemd[1]: Reloading.
Nov 28 17:07:59 compute-0 systemd-rc-local-generator[53316]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 17:07:59 compute-0 systemd-sysv-generator[53319]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 17:08:00 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 28 17:08:01 compute-0 sudo[52787]: pam_unix(sudo:session): session closed for user root
Nov 28 17:08:03 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 28 17:08:03 compute-0 systemd[1]: Finished man-db-cache-update.service.
Nov 28 17:08:03 compute-0 systemd[1]: run-rca9391564a2e4e27b53ce9c3e350d648.service: Deactivated successfully.
Nov 28 17:08:03 compute-0 sudo[53887]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ellelcnvetivwmsdcnfzxmvfbinyzrit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349682.868906-153-259646767872113/AnsiballZ_systemd.py'
Nov 28 17:08:03 compute-0 sudo[53887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:08:04 compute-0 python3.9[53889]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 28 17:08:04 compute-0 systemd[1]: Reloading.
Nov 28 17:08:04 compute-0 systemd-rc-local-generator[53919]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 17:08:04 compute-0 systemd-sysv-generator[53922]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 17:08:04 compute-0 systemd[1]: Starting Open vSwitch Database Unit...
Nov 28 17:08:04 compute-0 chown[53930]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Nov 28 17:08:04 compute-0 ovs-ctl[53935]: /etc/openvswitch/conf.db does not exist ... (warning).
Nov 28 17:08:04 compute-0 ovs-ctl[53935]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Nov 28 17:08:04 compute-0 ovs-ctl[53935]: Starting ovsdb-server [  OK  ]
Nov 28 17:08:04 compute-0 ovs-vsctl[53984]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Nov 28 17:08:04 compute-0 ovs-vsctl[54000]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"2ad2dbac-a967-40fb-b69b-7c374c5f8e9d\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Nov 28 17:08:04 compute-0 ovs-ctl[53935]: Configuring Open vSwitch system IDs [  OK  ]
Nov 28 17:08:04 compute-0 ovs-vsctl[54010]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Nov 28 17:08:04 compute-0 ovs-ctl[53935]: Enabling remote OVSDB managers [  OK  ]
Nov 28 17:08:04 compute-0 systemd[1]: Started Open vSwitch Database Unit.
Nov 28 17:08:04 compute-0 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Nov 28 17:08:04 compute-0 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Nov 28 17:08:04 compute-0 systemd[1]: Starting Open vSwitch Forwarding Unit...
Nov 28 17:08:05 compute-0 kernel: openvswitch: Open vSwitch switching datapath
Nov 28 17:08:05 compute-0 ovs-ctl[54054]: Inserting openvswitch module [  OK  ]
Nov 28 17:08:05 compute-0 ovs-ctl[54023]: Starting ovs-vswitchd [  OK  ]
Nov 28 17:08:05 compute-0 ovs-vsctl[54072]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Nov 28 17:08:05 compute-0 ovs-ctl[54023]: Enabling remote OVSDB managers [  OK  ]
Nov 28 17:08:05 compute-0 systemd[1]: Started Open vSwitch Forwarding Unit.
Nov 28 17:08:05 compute-0 systemd[1]: Starting Open vSwitch...
Nov 28 17:08:05 compute-0 systemd[1]: Finished Open vSwitch.
Nov 28 17:08:05 compute-0 sudo[53887]: pam_unix(sudo:session): session closed for user root
Nov 28 17:08:06 compute-0 python3.9[54224]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 17:08:07 compute-0 sudo[54374]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-whgqeszdepencgcrurbdpefitspbfaxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349687.1567724-189-171048556547266/AnsiballZ_sefcontext.py'
Nov 28 17:08:07 compute-0 sudo[54374]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:08:07 compute-0 python3.9[54376]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Nov 28 17:08:09 compute-0 kernel: SELinux:  Converting 2746 SID table entries...
Nov 28 17:08:09 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Nov 28 17:08:09 compute-0 kernel: SELinux:  policy capability open_perms=1
Nov 28 17:08:09 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Nov 28 17:08:09 compute-0 kernel: SELinux:  policy capability always_check_network=0
Nov 28 17:08:09 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 28 17:08:09 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 28 17:08:09 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 28 17:08:09 compute-0 sudo[54374]: pam_unix(sudo:session): session closed for user root
Nov 28 17:08:11 compute-0 python3.9[54531]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 17:08:12 compute-0 sudo[54687]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jcpzejasxabqavjfdmwlimguowsqiljs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349691.7217238-225-33462918912679/AnsiballZ_dnf.py'
Nov 28 17:08:12 compute-0 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Nov 28 17:08:12 compute-0 sudo[54687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:08:12 compute-0 python3.9[54689]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 17:08:13 compute-0 sudo[54687]: pam_unix(sudo:session): session closed for user root
Nov 28 17:08:14 compute-0 sudo[54840]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xcenkjcppyfdbfesxlkskbnkfneevysi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349694.0883572-241-152007283867583/AnsiballZ_command.py'
Nov 28 17:08:14 compute-0 sudo[54840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:08:14 compute-0 python3.9[54842]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 17:08:15 compute-0 sudo[54840]: pam_unix(sudo:session): session closed for user root
Nov 28 17:08:16 compute-0 sudo[55127]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwjlsunexnbntrcpcqlexzbbwfezyiyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349695.9105687-257-60121152820146/AnsiballZ_file.py'
Nov 28 17:08:16 compute-0 sudo[55127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:08:16 compute-0 python3.9[55129]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 28 17:08:16 compute-0 sudo[55127]: pam_unix(sudo:session): session closed for user root
Nov 28 17:08:17 compute-0 python3.9[55279]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 17:08:18 compute-0 sudo[55431]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-houezfvvnzjxehifqorujylyajslhmmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349698.1278436-289-207072854161102/AnsiballZ_dnf.py'
Nov 28 17:08:18 compute-0 sudo[55431]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:08:18 compute-0 python3.9[55433]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 17:08:20 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 28 17:08:20 compute-0 systemd[1]: Starting man-db-cache-update.service...
Nov 28 17:08:20 compute-0 systemd[1]: Reloading.
Nov 28 17:08:20 compute-0 systemd-rc-local-generator[55471]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 17:08:20 compute-0 systemd-sysv-generator[55475]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 17:08:20 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 28 17:08:21 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 28 17:08:21 compute-0 systemd[1]: Finished man-db-cache-update.service.
Nov 28 17:08:21 compute-0 systemd[1]: run-r1c6f57b8f6844da893207172f4875018.service: Deactivated successfully.
Nov 28 17:08:21 compute-0 sudo[55431]: pam_unix(sudo:session): session closed for user root
Nov 28 17:08:22 compute-0 sudo[55750]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmhmqfkfmvpdkjnjgoothkhdoferextl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349702.5142508-305-2961385240196/AnsiballZ_systemd.py'
Nov 28 17:08:22 compute-0 sudo[55750]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:08:23 compute-0 python3.9[55752]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 17:08:23 compute-0 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Nov 28 17:08:23 compute-0 systemd[1]: Stopped Network Manager Wait Online.
Nov 28 17:08:23 compute-0 systemd[1]: Stopping Network Manager Wait Online...
Nov 28 17:08:23 compute-0 systemd[1]: Stopping Network Manager...
Nov 28 17:08:23 compute-0 NetworkManager[7203]: <info>  [1764349703.1928] caught SIGTERM, shutting down normally.
Nov 28 17:08:23 compute-0 NetworkManager[7203]: <info>  [1764349703.1949] dhcp4 (eth0): canceled DHCP transaction
Nov 28 17:08:23 compute-0 NetworkManager[7203]: <info>  [1764349703.1949] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 28 17:08:23 compute-0 NetworkManager[7203]: <info>  [1764349703.1950] dhcp4 (eth0): state changed no lease
Nov 28 17:08:23 compute-0 NetworkManager[7203]: <info>  [1764349703.1952] manager: NetworkManager state is now CONNECTED_SITE
Nov 28 17:08:23 compute-0 NetworkManager[7203]: <info>  [1764349703.2021] exiting (success)
Nov 28 17:08:23 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 28 17:08:23 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 28 17:08:23 compute-0 systemd[1]: NetworkManager.service: Deactivated successfully.
Nov 28 17:08:23 compute-0 systemd[1]: Stopped Network Manager.
Nov 28 17:08:23 compute-0 systemd[1]: NetworkManager.service: Consumed 15.112s CPU time, 4.1M memory peak, read 0B from disk, written 21.0K to disk.
Nov 28 17:08:23 compute-0 systemd[1]: Starting Network Manager...
Nov 28 17:08:23 compute-0 NetworkManager[55763]: <info>  [1764349703.2569] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:554a2c9a-1114-4202-a0a9-67957e011662)
Nov 28 17:08:23 compute-0 NetworkManager[55763]: <info>  [1764349703.2570] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Nov 28 17:08:23 compute-0 NetworkManager[55763]: <info>  [1764349703.2626] manager[0x559591492090]: monitoring kernel firmware directory '/lib/firmware'.
Nov 28 17:08:23 compute-0 systemd[1]: Starting Hostname Service...
Nov 28 17:08:23 compute-0 systemd[1]: Started Hostname Service.
Nov 28 17:08:23 compute-0 NetworkManager[55763]: <info>  [1764349703.3632] hostname: hostname: using hostnamed
Nov 28 17:08:23 compute-0 NetworkManager[55763]: <info>  [1764349703.3633] hostname: static hostname changed from (none) to "compute-0"
Nov 28 17:08:23 compute-0 NetworkManager[55763]: <info>  [1764349703.3638] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 28 17:08:23 compute-0 NetworkManager[55763]: <info>  [1764349703.3644] manager[0x559591492090]: rfkill: Wi-Fi hardware radio set enabled
Nov 28 17:08:23 compute-0 NetworkManager[55763]: <info>  [1764349703.3645] manager[0x559591492090]: rfkill: WWAN hardware radio set enabled
Nov 28 17:08:23 compute-0 NetworkManager[55763]: <info>  [1764349703.3664] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-ovs.so)
Nov 28 17:08:23 compute-0 NetworkManager[55763]: <info>  [1764349703.3671] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Nov 28 17:08:23 compute-0 NetworkManager[55763]: <info>  [1764349703.3672] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 28 17:08:23 compute-0 NetworkManager[55763]: <info>  [1764349703.3672] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 28 17:08:23 compute-0 NetworkManager[55763]: <info>  [1764349703.3673] manager: Networking is enabled by state file
Nov 28 17:08:23 compute-0 NetworkManager[55763]: <info>  [1764349703.3675] settings: Loaded settings plugin: keyfile (internal)
Nov 28 17:08:23 compute-0 NetworkManager[55763]: <info>  [1764349703.3678] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 28 17:08:23 compute-0 NetworkManager[55763]: <info>  [1764349703.3701] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Nov 28 17:08:23 compute-0 NetworkManager[55763]: <info>  [1764349703.3708] dhcp: init: Using DHCP client 'internal'
Nov 28 17:08:23 compute-0 NetworkManager[55763]: <info>  [1764349703.3710] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 28 17:08:23 compute-0 NetworkManager[55763]: <info>  [1764349703.3715] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 28 17:08:23 compute-0 NetworkManager[55763]: <info>  [1764349703.3720] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Nov 28 17:08:23 compute-0 NetworkManager[55763]: <info>  [1764349703.3728] device (lo): Activation: starting connection 'lo' (2464996e-5b9c-4662-8d02-714ba4ef3d59)
Nov 28 17:08:23 compute-0 NetworkManager[55763]: <info>  [1764349703.3733] device (eth0): carrier: link connected
Nov 28 17:08:23 compute-0 NetworkManager[55763]: <info>  [1764349703.3737] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 28 17:08:23 compute-0 NetworkManager[55763]: <info>  [1764349703.3742] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Nov 28 17:08:23 compute-0 NetworkManager[55763]: <info>  [1764349703.3742] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 28 17:08:23 compute-0 NetworkManager[55763]: <info>  [1764349703.3748] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 28 17:08:23 compute-0 NetworkManager[55763]: <info>  [1764349703.3753] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 28 17:08:23 compute-0 NetworkManager[55763]: <info>  [1764349703.3758] device (eth1): carrier: link connected
Nov 28 17:08:23 compute-0 NetworkManager[55763]: <info>  [1764349703.3762] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 28 17:08:23 compute-0 NetworkManager[55763]: <info>  [1764349703.3766] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (3602a993-b6bf-5e94-b722-2dfc0f2c6254) (indicated)
Nov 28 17:08:23 compute-0 NetworkManager[55763]: <info>  [1764349703.3766] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 28 17:08:23 compute-0 NetworkManager[55763]: <info>  [1764349703.3771] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 28 17:08:23 compute-0 NetworkManager[55763]: <info>  [1764349703.3776] device (eth1): Activation: starting connection 'ci-private-network' (3602a993-b6bf-5e94-b722-2dfc0f2c6254)
Nov 28 17:08:23 compute-0 NetworkManager[55763]: <info>  [1764349703.3782] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 28 17:08:23 compute-0 systemd[1]: Started Network Manager.
Nov 28 17:08:23 compute-0 NetworkManager[55763]: <info>  [1764349703.3799] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Nov 28 17:08:23 compute-0 NetworkManager[55763]: <info>  [1764349703.3802] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Nov 28 17:08:23 compute-0 NetworkManager[55763]: <info>  [1764349703.3803] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Nov 28 17:08:23 compute-0 NetworkManager[55763]: <info>  [1764349703.3805] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 28 17:08:23 compute-0 NetworkManager[55763]: <info>  [1764349703.3807] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 28 17:08:23 compute-0 NetworkManager[55763]: <info>  [1764349703.3809] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 28 17:08:23 compute-0 NetworkManager[55763]: <info>  [1764349703.3810] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 28 17:08:23 compute-0 NetworkManager[55763]: <info>  [1764349703.3812] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Nov 28 17:08:23 compute-0 NetworkManager[55763]: <info>  [1764349703.3816] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 28 17:08:23 compute-0 NetworkManager[55763]: <info>  [1764349703.3818] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 28 17:08:23 compute-0 NetworkManager[55763]: <info>  [1764349703.3827] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 28 17:08:23 compute-0 NetworkManager[55763]: <info>  [1764349703.3840] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 28 17:08:23 compute-0 NetworkManager[55763]: <info>  [1764349703.3878] dhcp4 (eth0): state changed new lease, address=38.129.56.212
Nov 28 17:08:23 compute-0 NetworkManager[55763]: <info>  [1764349703.3884] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 28 17:08:23 compute-0 systemd[1]: Starting Network Manager Wait Online...
Nov 28 17:08:23 compute-0 NetworkManager[55763]: <info>  [1764349703.3952] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 28 17:08:23 compute-0 NetworkManager[55763]: <info>  [1764349703.3956] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Nov 28 17:08:23 compute-0 NetworkManager[55763]: <info>  [1764349703.3956] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 28 17:08:23 compute-0 NetworkManager[55763]: <info>  [1764349703.3958] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Nov 28 17:08:23 compute-0 NetworkManager[55763]: <info>  [1764349703.3962] device (lo): Activation: successful, device activated.
Nov 28 17:08:23 compute-0 NetworkManager[55763]: <info>  [1764349703.3966] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 28 17:08:23 compute-0 NetworkManager[55763]: <info>  [1764349703.3968] manager: NetworkManager state is now CONNECTED_LOCAL
Nov 28 17:08:23 compute-0 NetworkManager[55763]: <info>  [1764349703.3970] device (eth1): Activation: successful, device activated.
Nov 28 17:08:23 compute-0 NetworkManager[55763]: <info>  [1764349703.3975] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 28 17:08:23 compute-0 NetworkManager[55763]: <info>  [1764349703.3976] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 28 17:08:23 compute-0 NetworkManager[55763]: <info>  [1764349703.3979] manager: NetworkManager state is now CONNECTED_SITE
Nov 28 17:08:23 compute-0 NetworkManager[55763]: <info>  [1764349703.3981] device (eth0): Activation: successful, device activated.
Nov 28 17:08:23 compute-0 NetworkManager[55763]: <info>  [1764349703.3984] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 28 17:08:23 compute-0 NetworkManager[55763]: <info>  [1764349703.3985] manager: startup complete
Nov 28 17:08:23 compute-0 sudo[55750]: pam_unix(sudo:session): session closed for user root
Nov 28 17:08:23 compute-0 systemd[1]: Finished Network Manager Wait Online.
Nov 28 17:08:24 compute-0 sudo[55976]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qfbeirwdhosbjjhfpwzdmfvpupunfvft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349703.7117004-321-44005143905154/AnsiballZ_dnf.py'
Nov 28 17:08:24 compute-0 sudo[55976]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:08:24 compute-0 python3.9[55978]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 17:08:28 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 28 17:08:28 compute-0 systemd[1]: Starting man-db-cache-update.service...
Nov 28 17:08:28 compute-0 systemd[1]: Reloading.
Nov 28 17:08:29 compute-0 systemd-sysv-generator[56034]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 17:08:29 compute-0 systemd-rc-local-generator[56031]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 17:08:29 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 28 17:08:29 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 28 17:08:29 compute-0 systemd[1]: Finished man-db-cache-update.service.
Nov 28 17:08:29 compute-0 systemd[1]: run-rd6cb7fca2d914377ad51fbc0e5da0e12.service: Deactivated successfully.
Nov 28 17:08:30 compute-0 sudo[55976]: pam_unix(sudo:session): session closed for user root
Nov 28 17:08:32 compute-0 sudo[56439]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrzapvrtihufohynyqaqiedlyzlhnlqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349711.7845817-345-250517055496286/AnsiballZ_stat.py'
Nov 28 17:08:32 compute-0 sudo[56439]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:08:32 compute-0 python3.9[56441]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 17:08:32 compute-0 sudo[56439]: pam_unix(sudo:session): session closed for user root
Nov 28 17:08:33 compute-0 sudo[56591]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbrnenplynnfoxdbrakjjosjtkmmsiwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349712.6255045-363-161789560301426/AnsiballZ_ini_file.py'
Nov 28 17:08:33 compute-0 sudo[56591]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:08:33 compute-0 python3.9[56593]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:08:33 compute-0 sudo[56591]: pam_unix(sudo:session): session closed for user root
Nov 28 17:08:33 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 28 17:08:33 compute-0 sudo[56745]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lllzkmxhjuawwszcffflrpcklwyltweg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349713.618461-383-9982395816988/AnsiballZ_ini_file.py'
Nov 28 17:08:33 compute-0 sudo[56745]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:08:34 compute-0 python3.9[56747]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:08:34 compute-0 sudo[56745]: pam_unix(sudo:session): session closed for user root
Nov 28 17:08:34 compute-0 sudo[56897]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubwemjquyrezqnnyatmhabfdgwpjrwkm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349714.268624-383-198744730068336/AnsiballZ_ini_file.py'
Nov 28 17:08:34 compute-0 sudo[56897]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:08:34 compute-0 python3.9[56899]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:08:34 compute-0 sudo[56897]: pam_unix(sudo:session): session closed for user root
Nov 28 17:08:35 compute-0 sudo[57049]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xirsnxcibknhzdlqfoduitsnuyjwznfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349715.0912745-413-224353422103608/AnsiballZ_ini_file.py'
Nov 28 17:08:35 compute-0 sudo[57049]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:08:35 compute-0 python3.9[57051]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:08:35 compute-0 sudo[57049]: pam_unix(sudo:session): session closed for user root
Nov 28 17:08:36 compute-0 sudo[57201]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yghlmufcseqxggkviatbqjhppbuocgbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349715.7362368-413-85715974284654/AnsiballZ_ini_file.py'
Nov 28 17:08:36 compute-0 sudo[57201]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:08:36 compute-0 python3.9[57203]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:08:36 compute-0 sudo[57201]: pam_unix(sudo:session): session closed for user root
Nov 28 17:08:36 compute-0 sudo[57353]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxwdnmbkpnjphhubhxxnwsjlvwvjufaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349716.615554-443-105671758353757/AnsiballZ_stat.py'
Nov 28 17:08:36 compute-0 sudo[57353]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:08:37 compute-0 python3.9[57355]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:08:37 compute-0 sudo[57353]: pam_unix(sudo:session): session closed for user root
Nov 28 17:08:37 compute-0 sudo[57476]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pmlqjofjmaqrgvruvkdznkvidokhossi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349716.615554-443-105671758353757/AnsiballZ_copy.py'
Nov 28 17:08:37 compute-0 sudo[57476]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:08:37 compute-0 python3.9[57478]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1764349716.615554-443-105671758353757/.source _original_basename=.usyja2sg follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:08:37 compute-0 sudo[57476]: pam_unix(sudo:session): session closed for user root
Nov 28 17:08:38 compute-0 sudo[57628]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qhvvamzsvyumdiqmftziwgaubwcvlfnk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349718.13473-473-230094097548845/AnsiballZ_file.py'
Nov 28 17:08:38 compute-0 sudo[57628]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:08:38 compute-0 python3.9[57630]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:08:38 compute-0 sudo[57628]: pam_unix(sudo:session): session closed for user root
Nov 28 17:08:39 compute-0 sudo[57780]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pimysazpadsrnkyimdhtsnjqxnceybxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349718.8574061-489-187304792151149/AnsiballZ_edpm_os_net_config_mappings.py'
Nov 28 17:08:39 compute-0 sudo[57780]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:08:39 compute-0 python3.9[57782]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Nov 28 17:08:39 compute-0 sudo[57780]: pam_unix(sudo:session): session closed for user root
Nov 28 17:08:40 compute-0 sudo[57932]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wohtidvemgkkuttxyuylzuwiiexxevaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349720.2303972-507-20990536890900/AnsiballZ_file.py'
Nov 28 17:08:40 compute-0 sudo[57932]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:08:40 compute-0 python3.9[57934]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:08:40 compute-0 sudo[57932]: pam_unix(sudo:session): session closed for user root
Nov 28 17:08:41 compute-0 sudo[58084]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-luxiqbhzddtxkmkiqkmnjbktcvttfphx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349721.0844727-527-206701566037891/AnsiballZ_stat.py'
Nov 28 17:08:41 compute-0 sudo[58084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:08:41 compute-0 sudo[58084]: pam_unix(sudo:session): session closed for user root
Nov 28 17:08:41 compute-0 sudo[58207]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ysplhikmpnhzxnlwqsbxwyproljdjvnc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349721.0844727-527-206701566037891/AnsiballZ_copy.py'
Nov 28 17:08:41 compute-0 sudo[58207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:08:42 compute-0 sudo[58207]: pam_unix(sudo:session): session closed for user root
Nov 28 17:08:43 compute-0 sudo[58359]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nuxquuzkhqfmwhsnmnddubspsmtqxpyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349722.6185532-557-191928690759197/AnsiballZ_slurp.py'
Nov 28 17:08:43 compute-0 sudo[58359]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:08:43 compute-0 python3.9[58361]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Nov 28 17:08:43 compute-0 sudo[58359]: pam_unix(sudo:session): session closed for user root
Nov 28 17:08:44 compute-0 sudo[58534]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qvkjxrdefvrculucczjriffhfcklrvzr ; ANSIBLE_ASYNC_DIR=\'~/.ansible_async\' /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349723.5923998-575-193523012583153/async_wrapper.py j328489732811 300 /home/zuul/.ansible/tmp/ansible-tmp-1764349723.5923998-575-193523012583153/AnsiballZ_edpm_os_net_config.py _'
Nov 28 17:08:44 compute-0 sudo[58534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:08:44 compute-0 ansible-async_wrapper.py[58536]: Invoked with j328489732811 300 /home/zuul/.ansible/tmp/ansible-tmp-1764349723.5923998-575-193523012583153/AnsiballZ_edpm_os_net_config.py _
Nov 28 17:08:44 compute-0 ansible-async_wrapper.py[58539]: Starting module and watcher
Nov 28 17:08:44 compute-0 ansible-async_wrapper.py[58539]: Start watching 58540 (300)
Nov 28 17:08:44 compute-0 ansible-async_wrapper.py[58540]: Start module (58540)
Nov 28 17:08:44 compute-0 ansible-async_wrapper.py[58536]: Return async_wrapper task started.
Nov 28 17:08:44 compute-0 sudo[58534]: pam_unix(sudo:session): session closed for user root
Nov 28 17:08:44 compute-0 python3.9[58541]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Nov 28 17:08:45 compute-0 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Nov 28 17:08:45 compute-0 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Nov 28 17:08:45 compute-0 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Nov 28 17:08:45 compute-0 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Nov 28 17:08:45 compute-0 kernel: cfg80211: failed to load regulatory.db
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.4502] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58542 uid=0 result="success"
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.4519] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58542 uid=0 result="success"
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.4955] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.4956] audit: op="connection-add" uuid="830adfbc-757f-4c4e-b255-ea2e3d8bb60f" name="br-ex-br" pid=58542 uid=0 result="success"
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.4969] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.4970] audit: op="connection-add" uuid="b7187e85-718a-4672-8a49-07430a3b04cf" name="br-ex-port" pid=58542 uid=0 result="success"
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.4981] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.4982] audit: op="connection-add" uuid="a909861f-e6d5-458e-bd67-5067c6c10439" name="eth1-port" pid=58542 uid=0 result="success"
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.4992] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.4993] audit: op="connection-add" uuid="f6cc1cb1-20a5-495c-adf2-6d399d86d5b4" name="vlan20-port" pid=58542 uid=0 result="success"
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5002] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5003] audit: op="connection-add" uuid="9d770d66-b56d-48ee-80d7-9ec9b354a017" name="vlan21-port" pid=58542 uid=0 result="success"
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5013] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5014] audit: op="connection-add" uuid="87024f12-390e-4acb-8c3b-d0c4c54179c9" name="vlan22-port" pid=58542 uid=0 result="success"
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5034] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="802-3-ethernet.mtu,ipv4.dhcp-timeout,ipv4.dhcp-client-id,connection.autoconnect-priority,connection.timestamp,ipv6.dhcp-timeout,ipv6.method,ipv6.addr-gen-mode" pid=58542 uid=0 result="success"
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5047] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/10)
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5048] audit: op="connection-add" uuid="0a41e368-5f63-4829-8561-83337dab17e6" name="br-ex-if" pid=58542 uid=0 result="success"
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5121] audit: op="connection-update" uuid="3602a993-b6bf-5e94-b722-2dfc0f2c6254" name="ci-private-network" args="ipv4.dns,ipv4.never-default,ipv4.addresses,ipv4.method,ipv4.routing-rules,ipv4.routes,ovs-interface.type,connection.slave-type,connection.port-type,connection.controller,connection.master,connection.timestamp,ipv6.dns,ipv6.addresses,ipv6.method,ipv6.addr-gen-mode,ipv6.routing-rules,ipv6.routes,ovs-external-ids.data" pid=58542 uid=0 result="success"
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5135] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5137] audit: op="connection-add" uuid="6ceb1035-2978-4d81-84b8-b2224c6ca7d8" name="vlan20-if" pid=58542 uid=0 result="success"
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5150] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5152] audit: op="connection-add" uuid="1ef02a52-c40c-4654-8eec-98fa31d4bbdf" name="vlan21-if" pid=58542 uid=0 result="success"
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5165] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5166] audit: op="connection-add" uuid="f1ee04d3-1599-4b3d-977c-2a16f8de6ce5" name="vlan22-if" pid=58542 uid=0 result="success"
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5177] audit: op="connection-delete" uuid="3805b732-7d96-3b33-9481-4cdfa6af1193" name="Wired connection 1" pid=58542 uid=0 result="success"
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5189] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5200] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5204] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (830adfbc-757f-4c4e-b255-ea2e3d8bb60f)
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5204] audit: op="connection-activate" uuid="830adfbc-757f-4c4e-b255-ea2e3d8bb60f" name="br-ex-br" pid=58542 uid=0 result="success"
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5206] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5212] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5215] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (b7187e85-718a-4672-8a49-07430a3b04cf)
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5216] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5221] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5224] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (a909861f-e6d5-458e-bd67-5067c6c10439)
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5225] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5230] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5233] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (f6cc1cb1-20a5-495c-adf2-6d399d86d5b4)
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5234] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5239] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5242] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (9d770d66-b56d-48ee-80d7-9ec9b354a017)
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5244] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5249] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5252] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (87024f12-390e-4acb-8c3b-d0c4c54179c9)
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5252] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5255] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5256] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5261] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5264] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5268] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (0a41e368-5f63-4829-8561-83337dab17e6)
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5268] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5271] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5272] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5274] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5275] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5284] device (eth1): disconnecting for new activation request.
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5284] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5286] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5288] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5290] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5292] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5295] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5299] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (6ceb1035-2978-4d81-84b8-b2224c6ca7d8)
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5299] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5301] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5303] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5304] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5306] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5310] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5313] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (1ef02a52-c40c-4654-8eec-98fa31d4bbdf)
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5314] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5316] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5318] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5319] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5321] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5325] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5329] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (f1ee04d3-1599-4b3d-977c-2a16f8de6ce5)
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5329] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5332] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5334] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5334] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5335] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5345] audit: op="device-reapply" interface="eth0" ifindex=2 args="802-3-ethernet.mtu,ipv4.dhcp-timeout,ipv4.dhcp-client-id,connection.autoconnect-priority,ipv6.method,ipv6.addr-gen-mode" pid=58542 uid=0 result="success"
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5346] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5349] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5351] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5356] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5359] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5362] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5364] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5379] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 28 17:08:46 compute-0 kernel: ovs-system: entered promiscuous mode
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5385] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5390] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5395] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5398] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 28 17:08:46 compute-0 systemd-udevd[58548]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 17:08:46 compute-0 kernel: Timeout policy base is empty
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5404] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5409] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5412] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5415] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5420] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5425] dhcp4 (eth0): canceled DHCP transaction
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5426] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5426] dhcp4 (eth0): state changed no lease
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5429] dhcp4 (eth0): activation: beginning transaction (no timeout)
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5444] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5448] audit: op="device-reapply" interface="eth1" ifindex=3 pid=58542 uid=0 result="fail" reason="Device is not activated"
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5454] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5461] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5469] device (eth1): disconnecting for new activation request.
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5470] audit: op="connection-activate" uuid="3602a993-b6bf-5e94-b722-2dfc0f2c6254" name="ci-private-network" pid=58542 uid=0 result="success"
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5473] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5476] dhcp4 (eth0): state changed new lease, address=38.129.56.212
Nov 28 17:08:46 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5576] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58542 uid=0 result="success"
Nov 28 17:08:46 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5649] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5760] device (eth1): Activation: starting connection 'ci-private-network' (3602a993-b6bf-5e94-b722-2dfc0f2c6254)
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5767] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5778] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5784] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5792] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5800] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5807] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5809] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5811] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5813] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5816] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5836] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 28 17:08:46 compute-0 kernel: br-ex: entered promiscuous mode
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5843] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5849] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5853] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 28 17:08:46 compute-0 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5859] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5863] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5869] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5874] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5879] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5884] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5890] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5903] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5910] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 28 17:08:46 compute-0 kernel: vlan22: entered promiscuous mode
Nov 28 17:08:46 compute-0 systemd-udevd[58546]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5946] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5956] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5963] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5970] device (eth1): Activation: successful, device activated.
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.5979] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 28 17:08:46 compute-0 kernel: vlan21: entered promiscuous mode
Nov 28 17:08:46 compute-0 kernel: vlan20: entered promiscuous mode
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.6062] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.6065] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.6078] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.6085] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.6099] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.6121] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.6138] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.6151] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.6153] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.6170] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.6181] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.6181] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.6186] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.6191] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.6206] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.6246] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.6248] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 28 17:08:46 compute-0 NetworkManager[55763]: <info>  [1764349726.6254] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 28 17:08:47 compute-0 NetworkManager[55763]: <info>  [1764349727.7421] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58542 uid=0 result="success"
Nov 28 17:08:47 compute-0 NetworkManager[55763]: <info>  [1764349727.9339] checkpoint[0x559591468950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Nov 28 17:08:47 compute-0 NetworkManager[55763]: <info>  [1764349727.9341] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58542 uid=0 result="success"
Nov 28 17:08:48 compute-0 NetworkManager[55763]: <info>  [1764349728.1686] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58542 uid=0 result="success"
Nov 28 17:08:48 compute-0 NetworkManager[55763]: <info>  [1764349728.1697] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58542 uid=0 result="success"
Nov 28 17:08:48 compute-0 NetworkManager[55763]: <info>  [1764349728.3336] audit: op="networking-control" arg="global-dns-configuration" pid=58542 uid=0 result="success"
Nov 28 17:08:48 compute-0 NetworkManager[55763]: <info>  [1764349728.3365] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Nov 28 17:08:48 compute-0 NetworkManager[55763]: <info>  [1764349728.3393] audit: op="networking-control" arg="global-dns-configuration" pid=58542 uid=0 result="success"
Nov 28 17:08:48 compute-0 NetworkManager[55763]: <info>  [1764349728.3411] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58542 uid=0 result="success"
Nov 28 17:08:48 compute-0 NetworkManager[55763]: <info>  [1764349728.4709] checkpoint[0x559591468a20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Nov 28 17:08:48 compute-0 NetworkManager[55763]: <info>  [1764349728.4713] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58542 uid=0 result="success"
Nov 28 17:08:48 compute-0 ansible-async_wrapper.py[58540]: Module complete (58540)
Nov 28 17:08:48 compute-0 sudo[58878]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxfugajaglphfflkaqusjzrkghydsgdl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349727.7653904-575-209609802995990/AnsiballZ_async_status.py'
Nov 28 17:08:48 compute-0 sudo[58878]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:08:48 compute-0 python3.9[58880]: ansible-ansible.legacy.async_status Invoked with jid=j328489732811.58536 mode=status _async_dir=/root/.ansible_async
Nov 28 17:08:48 compute-0 sudo[58878]: pam_unix(sudo:session): session closed for user root
Nov 28 17:08:49 compute-0 sudo[58978]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bgjvljadswrpxdrkvafrzshzrqdyldkv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349727.7653904-575-209609802995990/AnsiballZ_async_status.py'
Nov 28 17:08:49 compute-0 sudo[58978]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:08:49 compute-0 python3.9[58980]: ansible-ansible.legacy.async_status Invoked with jid=j328489732811.58536 mode=cleanup _async_dir=/root/.ansible_async
Nov 28 17:08:49 compute-0 sudo[58978]: pam_unix(sudo:session): session closed for user root
Nov 28 17:08:49 compute-0 ansible-async_wrapper.py[58539]: Done in kid B.
Nov 28 17:08:53 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 28 17:08:53 compute-0 sudo[59133]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rovmrehqgnlsbrfvnyqzchcbaqjvecfj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349733.1918788-624-193624327678015/AnsiballZ_stat.py'
Nov 28 17:08:53 compute-0 sudo[59133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:08:53 compute-0 python3.9[59135]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:08:53 compute-0 sudo[59133]: pam_unix(sudo:session): session closed for user root
Nov 28 17:08:54 compute-0 sudo[59257]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gtrjqakhliaugwbxwgabomjdptxzpqcd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349733.1918788-624-193624327678015/AnsiballZ_copy.py'
Nov 28 17:08:54 compute-0 sudo[59257]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:08:54 compute-0 python3.9[59259]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764349733.1918788-624-193624327678015/.source.returncode _original_basename=.3cg3lk3p follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:08:54 compute-0 sudo[59257]: pam_unix(sudo:session): session closed for user root
Nov 28 17:08:54 compute-0 sudo[59409]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-padgmkrzmfzysiqaxlixahgjznjbajbw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349734.5544589-656-214064356451878/AnsiballZ_stat.py'
Nov 28 17:08:54 compute-0 sudo[59409]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:08:55 compute-0 python3.9[59411]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:08:55 compute-0 sudo[59409]: pam_unix(sudo:session): session closed for user root
Nov 28 17:08:55 compute-0 sudo[59533]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-funnqdhggsjupzmynvxczyvvnuvrmgbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349734.5544589-656-214064356451878/AnsiballZ_copy.py'
Nov 28 17:08:55 compute-0 sudo[59533]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:08:55 compute-0 python3.9[59535]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764349734.5544589-656-214064356451878/.source.cfg _original_basename=.f2gvk5qc follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:08:55 compute-0 sudo[59533]: pam_unix(sudo:session): session closed for user root
Nov 28 17:08:56 compute-0 sudo[59685]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxgstwrjmyczlmoxepqpqhxxsmxsjncn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349736.0040748-686-55085843811542/AnsiballZ_systemd.py'
Nov 28 17:08:56 compute-0 sudo[59685]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:08:56 compute-0 python3.9[59687]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 17:08:56 compute-0 systemd[1]: Reloading Network Manager...
Nov 28 17:08:56 compute-0 NetworkManager[55763]: <info>  [1764349736.6445] audit: op="reload" arg="0" pid=59691 uid=0 result="success"
Nov 28 17:08:56 compute-0 NetworkManager[55763]: <info>  [1764349736.6455] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Nov 28 17:08:56 compute-0 systemd[1]: Reloaded Network Manager.
Nov 28 17:08:56 compute-0 sudo[59685]: pam_unix(sudo:session): session closed for user root
Nov 28 17:08:57 compute-0 sshd-session[51766]: Connection closed by 192.168.122.30 port 60314
Nov 28 17:08:57 compute-0 sshd-session[51763]: pam_unix(sshd:session): session closed for user zuul
Nov 28 17:08:57 compute-0 systemd[1]: session-12.scope: Deactivated successfully.
Nov 28 17:08:57 compute-0 systemd[1]: session-12.scope: Consumed 49.071s CPU time.
Nov 28 17:08:57 compute-0 systemd-logind[788]: Session 12 logged out. Waiting for processes to exit.
Nov 28 17:08:57 compute-0 systemd-logind[788]: Removed session 12.
Nov 28 17:09:02 compute-0 sshd-session[59722]: Accepted publickey for zuul from 192.168.122.30 port 49616 ssh2: ECDSA SHA256:WZF57Px3euv6N78lWIGFzFXKuQ6jzdjGOx/DgOfNYD8
Nov 28 17:09:02 compute-0 systemd-logind[788]: New session 13 of user zuul.
Nov 28 17:09:02 compute-0 systemd[1]: Started Session 13 of User zuul.
Nov 28 17:09:02 compute-0 sshd-session[59722]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 28 17:09:03 compute-0 python3.9[59875]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 17:09:04 compute-0 python3.9[60029]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 17:09:05 compute-0 python3.9[60219]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 17:09:06 compute-0 sshd-session[59725]: Connection closed by 192.168.122.30 port 49616
Nov 28 17:09:06 compute-0 sshd-session[59722]: pam_unix(sshd:session): session closed for user zuul
Nov 28 17:09:06 compute-0 systemd[1]: session-13.scope: Deactivated successfully.
Nov 28 17:09:06 compute-0 systemd[1]: session-13.scope: Consumed 2.237s CPU time.
Nov 28 17:09:06 compute-0 systemd-logind[788]: Session 13 logged out. Waiting for processes to exit.
Nov 28 17:09:06 compute-0 systemd-logind[788]: Removed session 13.
Nov 28 17:09:06 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 28 17:09:11 compute-0 sshd-session[60249]: Accepted publickey for zuul from 192.168.122.30 port 53838 ssh2: ECDSA SHA256:WZF57Px3euv6N78lWIGFzFXKuQ6jzdjGOx/DgOfNYD8
Nov 28 17:09:11 compute-0 systemd-logind[788]: New session 14 of user zuul.
Nov 28 17:09:11 compute-0 systemd[1]: Started Session 14 of User zuul.
Nov 28 17:09:11 compute-0 sshd-session[60249]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 28 17:09:12 compute-0 python3.9[60402]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 17:09:13 compute-0 python3.9[60556]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 17:09:14 compute-0 sudo[60711]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hurdsxbuatmhynuhvblceliivupepgma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349754.2324293-65-29746612991835/AnsiballZ_setup.py'
Nov 28 17:09:14 compute-0 sudo[60711]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:09:14 compute-0 python3.9[60713]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 17:09:15 compute-0 sudo[60711]: pam_unix(sudo:session): session closed for user root
Nov 28 17:09:15 compute-0 sudo[60795]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aqvdwwmmsxijmisbzcjqqznpwijkcjwa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349754.2324293-65-29746612991835/AnsiballZ_dnf.py'
Nov 28 17:09:15 compute-0 sudo[60795]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:09:15 compute-0 python3.9[60797]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 17:09:17 compute-0 sudo[60795]: pam_unix(sudo:session): session closed for user root
Nov 28 17:09:17 compute-0 sudo[60949]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-olbwuaxjykcycbrhevwnggcpxhxfifws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349757.5976527-89-171489663303552/AnsiballZ_setup.py'
Nov 28 17:09:17 compute-0 sudo[60949]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:09:18 compute-0 python3.9[60951]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 17:09:18 compute-0 sudo[60949]: pam_unix(sudo:session): session closed for user root
Nov 28 17:09:19 compute-0 sudo[61140]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukotznrisucqnharvnvjyegcscqoyswu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349758.9056985-111-45808398801729/AnsiballZ_file.py'
Nov 28 17:09:19 compute-0 sudo[61140]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:09:19 compute-0 python3.9[61142]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:09:19 compute-0 sudo[61140]: pam_unix(sudo:session): session closed for user root
Nov 28 17:09:20 compute-0 sudo[61292]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svtqfenhrqytyxtaadlkgexncwnnpvzo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349759.7745693-127-71977913192507/AnsiballZ_command.py'
Nov 28 17:09:20 compute-0 sudo[61292]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:09:20 compute-0 python3.9[61294]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 17:09:20 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 17:09:20 compute-0 sudo[61292]: pam_unix(sudo:session): session closed for user root
Nov 28 17:09:21 compute-0 sudo[61456]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-brleglfhzpybpxrwfdmdxaziyepxrnhv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349760.7073953-143-71239722188447/AnsiballZ_stat.py'
Nov 28 17:09:21 compute-0 sudo[61456]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:09:21 compute-0 python3.9[61458]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:09:21 compute-0 sudo[61456]: pam_unix(sudo:session): session closed for user root
Nov 28 17:09:21 compute-0 sudo[61534]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-icnkhfokkvpugvmkdpudxmxazkszsfpw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349760.7073953-143-71239722188447/AnsiballZ_file.py'
Nov 28 17:09:21 compute-0 sudo[61534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:09:21 compute-0 python3.9[61536]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:09:21 compute-0 sudo[61534]: pam_unix(sudo:session): session closed for user root
Nov 28 17:09:22 compute-0 sudo[61686]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unvpsszssizkenrxxxvqxdhiboqhmddv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349762.084936-167-39967680008473/AnsiballZ_stat.py'
Nov 28 17:09:22 compute-0 sudo[61686]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:09:22 compute-0 python3.9[61688]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:09:22 compute-0 sudo[61686]: pam_unix(sudo:session): session closed for user root
Nov 28 17:09:22 compute-0 sudo[61764]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iakvsegctyifzajfdsgqognpgcnzvlhm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349762.084936-167-39967680008473/AnsiballZ_file.py'
Nov 28 17:09:22 compute-0 sudo[61764]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:09:22 compute-0 python3.9[61766]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:09:23 compute-0 sudo[61764]: pam_unix(sudo:session): session closed for user root
Nov 28 17:09:23 compute-0 sudo[61916]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yeelkjrbetovrhkjoakvsqigytcgaatb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349763.4146752-193-156955348935221/AnsiballZ_ini_file.py'
Nov 28 17:09:23 compute-0 sudo[61916]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:09:24 compute-0 python3.9[61918]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:09:24 compute-0 sudo[61916]: pam_unix(sudo:session): session closed for user root
Nov 28 17:09:24 compute-0 sudo[62068]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pepdycelsdubyrhtvnippdtxdshzynfi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349764.1425016-193-59719669887056/AnsiballZ_ini_file.py'
Nov 28 17:09:24 compute-0 sudo[62068]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:09:24 compute-0 python3.9[62070]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:09:24 compute-0 sudo[62068]: pam_unix(sudo:session): session closed for user root
Nov 28 17:09:25 compute-0 sudo[62220]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvrvxbgdoeihojkbqqhryeajasiytltx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349764.7551937-193-67469976998951/AnsiballZ_ini_file.py'
Nov 28 17:09:25 compute-0 sudo[62220]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:09:25 compute-0 python3.9[62222]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:09:25 compute-0 sudo[62220]: pam_unix(sudo:session): session closed for user root
Nov 28 17:09:25 compute-0 sudo[62372]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlahfwoixtwseqairlwmlopgpcrtycpr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349765.3639352-193-133977054726960/AnsiballZ_ini_file.py'
Nov 28 17:09:25 compute-0 sudo[62372]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:09:25 compute-0 python3.9[62374]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:09:25 compute-0 sudo[62372]: pam_unix(sudo:session): session closed for user root
Nov 28 17:09:26 compute-0 sudo[62524]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thvozdjpwwdfqigmnfugaaswueqlpazf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349766.4097283-255-145453452415122/AnsiballZ_dnf.py'
Nov 28 17:09:26 compute-0 sudo[62524]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:09:26 compute-0 python3.9[62526]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 17:09:28 compute-0 sudo[62524]: pam_unix(sudo:session): session closed for user root
Nov 28 17:09:29 compute-0 sudo[62677]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pdlgdyesgyplxytfgfpnqoibbhcboiky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349769.1333005-277-189548514874701/AnsiballZ_setup.py'
Nov 28 17:09:29 compute-0 sudo[62677]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:09:29 compute-0 python3.9[62679]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 17:09:29 compute-0 sudo[62677]: pam_unix(sudo:session): session closed for user root
Nov 28 17:09:30 compute-0 sudo[62831]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfrpmsvehiktfrpdgelyogwozodbntyy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349770.0348234-293-185856458391949/AnsiballZ_stat.py'
Nov 28 17:09:30 compute-0 sudo[62831]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:09:30 compute-0 python3.9[62833]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 17:09:30 compute-0 sudo[62831]: pam_unix(sudo:session): session closed for user root
Nov 28 17:09:31 compute-0 sudo[62983]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rwtonjurzhyaysnwctqrfbswvvjeveet ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349770.8753676-311-221977989866221/AnsiballZ_stat.py'
Nov 28 17:09:31 compute-0 sudo[62983]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:09:31 compute-0 python3.9[62985]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 17:09:31 compute-0 sudo[62983]: pam_unix(sudo:session): session closed for user root
Nov 28 17:09:32 compute-0 sudo[63135]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhlvlseonbjssvbeymeoxucqvgtqirxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349771.7648528-331-251003897503717/AnsiballZ_command.py'
Nov 28 17:09:32 compute-0 sudo[63135]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:09:32 compute-0 python3.9[63137]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 17:09:32 compute-0 sudo[63135]: pam_unix(sudo:session): session closed for user root
Nov 28 17:09:33 compute-0 sudo[63288]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qhsrvntyjdftephmgfzotnxqjbolryxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349772.676752-351-177103100679327/AnsiballZ_service_facts.py'
Nov 28 17:09:33 compute-0 sudo[63288]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:09:33 compute-0 python3.9[63290]: ansible-service_facts Invoked
Nov 28 17:09:33 compute-0 network[63307]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 28 17:09:33 compute-0 network[63308]: 'network-scripts' will be removed from distribution in near future.
Nov 28 17:09:33 compute-0 network[63309]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 28 17:09:36 compute-0 sudo[63288]: pam_unix(sudo:session): session closed for user root
Nov 28 17:09:37 compute-0 sudo[63592]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrlrwwgqzrgzfxlsskinwindhyxndzjy ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1764349776.993707-381-259451609064125/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1764349776.993707-381-259451609064125/args'
Nov 28 17:09:37 compute-0 sudo[63592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:09:37 compute-0 sudo[63592]: pam_unix(sudo:session): session closed for user root
Nov 28 17:09:38 compute-0 sudo[63759]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmuvroqhhdibwvntqxwvvizbyzhiajew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349777.7623987-403-244765994513838/AnsiballZ_dnf.py'
Nov 28 17:09:38 compute-0 sudo[63759]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:09:38 compute-0 python3.9[63761]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 17:09:40 compute-0 sudo[63759]: pam_unix(sudo:session): session closed for user root
Nov 28 17:09:41 compute-0 sudo[63912]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfiwbwiskxtabracgrswqqbyacsvdreo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349780.8328295-429-131717357641290/AnsiballZ_package_facts.py'
Nov 28 17:09:41 compute-0 sudo[63912]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:09:41 compute-0 python3.9[63914]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Nov 28 17:09:41 compute-0 sudo[63912]: pam_unix(sudo:session): session closed for user root
Nov 28 17:09:43 compute-0 sudo[64064]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-axtzdmljtgpvixoghifmpfqrfoltjjrb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349782.670746-449-143144600677152/AnsiballZ_stat.py'
Nov 28 17:09:43 compute-0 sudo[64064]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:09:43 compute-0 python3.9[64066]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:09:43 compute-0 sudo[64064]: pam_unix(sudo:session): session closed for user root
Nov 28 17:09:43 compute-0 sudo[64189]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xlkrupceaeapanptorkhaqmkifiqoldu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349782.670746-449-143144600677152/AnsiballZ_copy.py'
Nov 28 17:09:43 compute-0 sudo[64189]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:09:44 compute-0 python3.9[64191]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764349782.670746-449-143144600677152/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:09:44 compute-0 sudo[64189]: pam_unix(sudo:session): session closed for user root
Nov 28 17:09:44 compute-0 sudo[64343]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlkitgeawvchcqkbtwuilrwtsyjdivhf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349784.440345-479-83529210052931/AnsiballZ_stat.py'
Nov 28 17:09:44 compute-0 sudo[64343]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:09:44 compute-0 python3.9[64345]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:09:44 compute-0 sudo[64343]: pam_unix(sudo:session): session closed for user root
Nov 28 17:09:45 compute-0 sudo[64468]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ayjmkssnyhjdrmegldmbwlptfxbaucxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349784.440345-479-83529210052931/AnsiballZ_copy.py'
Nov 28 17:09:45 compute-0 sudo[64468]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:09:45 compute-0 python3.9[64470]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764349784.440345-479-83529210052931/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:09:45 compute-0 sudo[64468]: pam_unix(sudo:session): session closed for user root
Nov 28 17:09:46 compute-0 sudo[64622]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oonjqyettpfpitjdjyypbfyremvrqrmt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349786.330973-521-262647345637321/AnsiballZ_lineinfile.py'
Nov 28 17:09:46 compute-0 sudo[64622]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:09:47 compute-0 python3.9[64624]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:09:47 compute-0 sudo[64622]: pam_unix(sudo:session): session closed for user root
Nov 28 17:09:48 compute-0 sudo[64776]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnoyvvtkigqpbcpkcflgtjztyenlzeel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349787.8098867-551-137425681102609/AnsiballZ_setup.py'
Nov 28 17:09:48 compute-0 sudo[64776]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:09:48 compute-0 python3.9[64778]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 17:09:48 compute-0 sudo[64776]: pam_unix(sudo:session): session closed for user root
Nov 28 17:09:49 compute-0 sudo[64860]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qedbcmndqjpmondhlwnljqflulttermi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349787.8098867-551-137425681102609/AnsiballZ_systemd.py'
Nov 28 17:09:49 compute-0 sudo[64860]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:09:49 compute-0 python3.9[64862]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 17:09:49 compute-0 sudo[64860]: pam_unix(sudo:session): session closed for user root
Nov 28 17:09:50 compute-0 sudo[65014]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbevbcysjztufyjfxsjxmofocjeneqsj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349790.0705502-583-261939136538724/AnsiballZ_setup.py'
Nov 28 17:09:50 compute-0 sudo[65014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:09:50 compute-0 python3.9[65016]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 17:09:50 compute-0 sudo[65014]: pam_unix(sudo:session): session closed for user root
Nov 28 17:09:51 compute-0 sudo[65098]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mirgafvpspdidojxncaqvplacouhtmju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349790.0705502-583-261939136538724/AnsiballZ_systemd.py'
Nov 28 17:09:51 compute-0 sudo[65098]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:09:51 compute-0 python3.9[65100]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 17:09:51 compute-0 chronyd[792]: chronyd exiting
Nov 28 17:09:51 compute-0 systemd[1]: Stopping NTP client/server...
Nov 28 17:09:51 compute-0 systemd[1]: chronyd.service: Deactivated successfully.
Nov 28 17:09:51 compute-0 systemd[1]: Stopped NTP client/server.
Nov 28 17:09:51 compute-0 systemd[1]: Starting NTP client/server...
Nov 28 17:09:51 compute-0 chronyd[65109]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Nov 28 17:09:51 compute-0 chronyd[65109]: Frequency -28.704 +/- 0.290 ppm read from /var/lib/chrony/drift
Nov 28 17:09:51 compute-0 chronyd[65109]: Loaded seccomp filter (level 2)
Nov 28 17:09:51 compute-0 systemd[1]: Started NTP client/server.
Nov 28 17:09:51 compute-0 sudo[65098]: pam_unix(sudo:session): session closed for user root
Nov 28 17:09:52 compute-0 sshd-session[60252]: Connection closed by 192.168.122.30 port 53838
Nov 28 17:09:52 compute-0 sshd-session[60249]: pam_unix(sshd:session): session closed for user zuul
Nov 28 17:09:52 compute-0 systemd[1]: session-14.scope: Deactivated successfully.
Nov 28 17:09:52 compute-0 systemd[1]: session-14.scope: Consumed 25.378s CPU time.
Nov 28 17:09:52 compute-0 systemd-logind[788]: Session 14 logged out. Waiting for processes to exit.
Nov 28 17:09:52 compute-0 systemd-logind[788]: Removed session 14.
Nov 28 17:09:58 compute-0 sshd-session[65135]: Accepted publickey for zuul from 192.168.122.30 port 38676 ssh2: ECDSA SHA256:WZF57Px3euv6N78lWIGFzFXKuQ6jzdjGOx/DgOfNYD8
Nov 28 17:09:58 compute-0 systemd-logind[788]: New session 15 of user zuul.
Nov 28 17:09:58 compute-0 systemd[1]: Started Session 15 of User zuul.
Nov 28 17:09:58 compute-0 sshd-session[65135]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 28 17:09:59 compute-0 python3.9[65288]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 17:10:00 compute-0 sudo[65442]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-acuqutrstwroycwegpouvmkjzqdczren ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349799.7159274-51-155637981707224/AnsiballZ_file.py'
Nov 28 17:10:00 compute-0 sudo[65442]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:10:00 compute-0 python3.9[65444]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:10:00 compute-0 sudo[65442]: pam_unix(sudo:session): session closed for user root
Nov 28 17:10:01 compute-0 sudo[65617]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qbdcvtvfpzjgvnirzwlwcumsspimnuhf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349800.5745707-67-183331670614703/AnsiballZ_stat.py'
Nov 28 17:10:01 compute-0 sudo[65617]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:10:01 compute-0 python3.9[65619]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:10:01 compute-0 sudo[65617]: pam_unix(sudo:session): session closed for user root
Nov 28 17:10:01 compute-0 sudo[65695]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvsovaehhdzeefriexlchetwtxggumaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349800.5745707-67-183331670614703/AnsiballZ_file.py'
Nov 28 17:10:01 compute-0 sudo[65695]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:10:01 compute-0 python3.9[65697]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.vl8ft8fo recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:10:01 compute-0 sudo[65695]: pam_unix(sudo:session): session closed for user root
Nov 28 17:10:02 compute-0 sudo[65847]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrwueujujsqmnwjjgzogffdyyeinfbwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349802.182189-107-190014577899449/AnsiballZ_stat.py'
Nov 28 17:10:02 compute-0 sudo[65847]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:10:02 compute-0 python3.9[65849]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:10:02 compute-0 sudo[65847]: pam_unix(sudo:session): session closed for user root
Nov 28 17:10:03 compute-0 sudo[65970]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ppwaihwrknfxldkjdzbdmawwuzsuccsg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349802.182189-107-190014577899449/AnsiballZ_copy.py'
Nov 28 17:10:03 compute-0 sudo[65970]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:10:03 compute-0 python3.9[65972]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764349802.182189-107-190014577899449/.source _original_basename=.mat6rtj5 follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:10:03 compute-0 sudo[65970]: pam_unix(sudo:session): session closed for user root
Nov 28 17:10:04 compute-0 sudo[66122]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-anvozswspuijqdggleiqzapaotvuxvfn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349803.6926556-139-12652899515818/AnsiballZ_file.py'
Nov 28 17:10:04 compute-0 sudo[66122]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:10:04 compute-0 python3.9[66124]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:10:04 compute-0 sudo[66122]: pam_unix(sudo:session): session closed for user root
Nov 28 17:10:05 compute-0 sudo[66274]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wgawrmobxintzapqauqnaxdbsgwtngrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349804.7525947-155-218173248283642/AnsiballZ_stat.py'
Nov 28 17:10:05 compute-0 sudo[66274]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:10:06 compute-0 python3.9[66276]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:10:06 compute-0 sudo[66274]: pam_unix(sudo:session): session closed for user root
Nov 28 17:10:06 compute-0 sudo[66397]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zmaookgbymkfhdotablvppjoajyqjwjf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349804.7525947-155-218173248283642/AnsiballZ_copy.py'
Nov 28 17:10:06 compute-0 sudo[66397]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:10:06 compute-0 python3.9[66399]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764349804.7525947-155-218173248283642/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:10:06 compute-0 sudo[66397]: pam_unix(sudo:session): session closed for user root
Nov 28 17:10:07 compute-0 sudo[66549]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nnzmmuruhlhntdiboahjxvwyxnvykrwx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349806.807473-155-6162504466950/AnsiballZ_stat.py'
Nov 28 17:10:07 compute-0 sudo[66549]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:10:07 compute-0 python3.9[66551]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:10:07 compute-0 sudo[66549]: pam_unix(sudo:session): session closed for user root
Nov 28 17:10:07 compute-0 sudo[66672]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xiiycelmqzggpggxpbdxbauusgpebhor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349806.807473-155-6162504466950/AnsiballZ_copy.py'
Nov 28 17:10:07 compute-0 sudo[66672]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:10:07 compute-0 python3.9[66674]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764349806.807473-155-6162504466950/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:10:07 compute-0 sudo[66672]: pam_unix(sudo:session): session closed for user root
Nov 28 17:10:08 compute-0 sudo[66824]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijuactbjbgtlyjttcrkunemindsxjost ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349808.0308754-213-174883487983139/AnsiballZ_file.py'
Nov 28 17:10:08 compute-0 sudo[66824]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:10:08 compute-0 python3.9[66826]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:10:08 compute-0 sudo[66824]: pam_unix(sudo:session): session closed for user root
Nov 28 17:10:09 compute-0 sudo[66976]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtqgchuhpgovrsfuweehmaueptprzrxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349808.8104603-229-267574454229953/AnsiballZ_stat.py'
Nov 28 17:10:09 compute-0 sudo[66976]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:10:09 compute-0 python3.9[66978]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:10:09 compute-0 sudo[66976]: pam_unix(sudo:session): session closed for user root
Nov 28 17:10:09 compute-0 sudo[67099]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-onirvtqvpucqgovjnxhhimpmwaavhoqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349808.8104603-229-267574454229953/AnsiballZ_copy.py'
Nov 28 17:10:09 compute-0 sudo[67099]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:10:09 compute-0 python3.9[67101]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764349808.8104603-229-267574454229953/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:10:10 compute-0 sudo[67099]: pam_unix(sudo:session): session closed for user root
Nov 28 17:10:10 compute-0 sudo[67251]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wbwpbtflbyasitjlxqebjqiokcpzowod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349810.241733-259-81240088407056/AnsiballZ_stat.py'
Nov 28 17:10:10 compute-0 sudo[67251]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:10:10 compute-0 python3.9[67253]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:10:10 compute-0 sudo[67251]: pam_unix(sudo:session): session closed for user root
Nov 28 17:10:11 compute-0 sudo[67374]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvcgaqxesionjmkzohwsevuhidpdltcb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349810.241733-259-81240088407056/AnsiballZ_copy.py'
Nov 28 17:10:11 compute-0 sudo[67374]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:10:11 compute-0 python3.9[67376]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764349810.241733-259-81240088407056/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:10:11 compute-0 sudo[67374]: pam_unix(sudo:session): session closed for user root
Nov 28 17:10:12 compute-0 sudo[67526]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhwjoknlcqszdsjmdtmhhzelzksqgwjc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349811.550896-289-270069602252593/AnsiballZ_systemd.py'
Nov 28 17:10:12 compute-0 sudo[67526]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:10:12 compute-0 python3.9[67528]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 17:10:12 compute-0 systemd[1]: Reloading.
Nov 28 17:10:12 compute-0 systemd-sysv-generator[67556]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 17:10:12 compute-0 systemd-rc-local-generator[67550]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 17:10:12 compute-0 systemd[1]: Reloading.
Nov 28 17:10:12 compute-0 systemd-rc-local-generator[67591]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 17:10:12 compute-0 systemd-sysv-generator[67595]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 17:10:12 compute-0 systemd[1]: Starting EDPM Container Shutdown...
Nov 28 17:10:13 compute-0 systemd[1]: Finished EDPM Container Shutdown.
Nov 28 17:10:13 compute-0 sudo[67526]: pam_unix(sudo:session): session closed for user root
Nov 28 17:10:13 compute-0 sudo[67754]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwnmqinubvrfkkgigoifloyanzdnpwls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349813.5351222-305-225597835265504/AnsiballZ_stat.py'
Nov 28 17:10:13 compute-0 sudo[67754]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:10:14 compute-0 python3.9[67756]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:10:14 compute-0 sudo[67754]: pam_unix(sudo:session): session closed for user root
Nov 28 17:10:14 compute-0 sudo[67877]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gkabjwtprlyoouzzeprcfkguacljciga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349813.5351222-305-225597835265504/AnsiballZ_copy.py'
Nov 28 17:10:14 compute-0 sudo[67877]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:10:14 compute-0 python3.9[67879]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764349813.5351222-305-225597835265504/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:10:14 compute-0 sudo[67877]: pam_unix(sudo:session): session closed for user root
Nov 28 17:10:15 compute-0 sudo[68029]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzfxcisgsmgvbvjkkdlluxccomdtxhxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349815.0263925-335-216814456838431/AnsiballZ_stat.py'
Nov 28 17:10:15 compute-0 sudo[68029]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:10:15 compute-0 python3.9[68031]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:10:15 compute-0 sudo[68029]: pam_unix(sudo:session): session closed for user root
Nov 28 17:10:15 compute-0 sudo[68152]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hnyqnaorbqbwgymjphxmfgmozaqxqlif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349815.0263925-335-216814456838431/AnsiballZ_copy.py'
Nov 28 17:10:15 compute-0 sudo[68152]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:10:16 compute-0 python3.9[68154]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764349815.0263925-335-216814456838431/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:10:16 compute-0 sudo[68152]: pam_unix(sudo:session): session closed for user root
Nov 28 17:10:16 compute-0 sudo[68304]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rkkavrryrvtnnjazyebyxpznpmlyoizl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349816.6021936-365-65477036716663/AnsiballZ_systemd.py'
Nov 28 17:10:16 compute-0 sudo[68304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:10:17 compute-0 python3.9[68306]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 17:10:17 compute-0 systemd[1]: Reloading.
Nov 28 17:10:17 compute-0 systemd-sysv-generator[68339]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 17:10:17 compute-0 systemd-rc-local-generator[68335]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 17:10:17 compute-0 systemd[1]: Reloading.
Nov 28 17:10:17 compute-0 systemd-rc-local-generator[68368]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 17:10:17 compute-0 systemd-sysv-generator[68373]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 17:10:17 compute-0 systemd[1]: Starting Create netns directory...
Nov 28 17:10:17 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 28 17:10:17 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 28 17:10:17 compute-0 systemd[1]: Finished Create netns directory.
Nov 28 17:10:17 compute-0 sudo[68304]: pam_unix(sudo:session): session closed for user root
Nov 28 17:10:18 compute-0 python3.9[68533]: ansible-ansible.builtin.service_facts Invoked
Nov 28 17:10:18 compute-0 network[68550]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 28 17:10:18 compute-0 network[68551]: 'network-scripts' will be removed from distribution in near future.
Nov 28 17:10:18 compute-0 network[68552]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 28 17:10:22 compute-0 sudo[68812]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-exfjesaewimkceboshiczlgrqdlkyrmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349822.4015613-397-71474456030765/AnsiballZ_systemd.py'
Nov 28 17:10:22 compute-0 sudo[68812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:10:23 compute-0 python3.9[68814]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 17:10:23 compute-0 systemd[1]: Reloading.
Nov 28 17:10:23 compute-0 systemd-rc-local-generator[68838]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 17:10:23 compute-0 systemd-sysv-generator[68845]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 17:10:23 compute-0 systemd[1]: Stopping IPv4 firewall with iptables...
Nov 28 17:10:23 compute-0 iptables.init[68854]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Nov 28 17:10:23 compute-0 iptables.init[68854]: iptables: Flushing firewall rules: [  OK  ]
Nov 28 17:10:23 compute-0 systemd[1]: iptables.service: Deactivated successfully.
Nov 28 17:10:23 compute-0 systemd[1]: Stopped IPv4 firewall with iptables.
Nov 28 17:10:23 compute-0 sudo[68812]: pam_unix(sudo:session): session closed for user root
Nov 28 17:10:24 compute-0 sudo[69049]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-quijoqsrziridvvhzsqpqmavmixauqrx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349823.954284-397-90700314388270/AnsiballZ_systemd.py'
Nov 28 17:10:24 compute-0 sudo[69049]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:10:24 compute-0 python3.9[69051]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 17:10:24 compute-0 sudo[69049]: pam_unix(sudo:session): session closed for user root
Nov 28 17:10:25 compute-0 sudo[69203]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hnlrmkcjbhmjohncwbpzsazguijtktfs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349824.8460274-429-144127116128224/AnsiballZ_systemd.py'
Nov 28 17:10:25 compute-0 sudo[69203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:10:25 compute-0 python3.9[69205]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 17:10:25 compute-0 systemd[1]: Reloading.
Nov 28 17:10:25 compute-0 systemd-rc-local-generator[69231]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 17:10:25 compute-0 systemd-sysv-generator[69234]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 17:10:25 compute-0 systemd[1]: Starting Netfilter Tables...
Nov 28 17:10:25 compute-0 systemd[1]: Finished Netfilter Tables.
Nov 28 17:10:25 compute-0 sudo[69203]: pam_unix(sudo:session): session closed for user root
Nov 28 17:10:26 compute-0 sudo[69395]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibvsyjlsirenregvrfaprujwemlwhzvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349825.9895804-445-111994653115882/AnsiballZ_command.py'
Nov 28 17:10:26 compute-0 sudo[69395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:10:26 compute-0 python3.9[69397]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 17:10:26 compute-0 sudo[69395]: pam_unix(sudo:session): session closed for user root
Nov 28 17:10:27 compute-0 sudo[69548]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sitvmknfydagiuhbndejgolqeyrwwkzo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349827.1473603-473-249892112726253/AnsiballZ_stat.py'
Nov 28 17:10:27 compute-0 sudo[69548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:10:27 compute-0 python3.9[69550]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:10:27 compute-0 sudo[69548]: pam_unix(sudo:session): session closed for user root
Nov 28 17:10:28 compute-0 sudo[69673]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnhvevthacuvpklngrxxtdtvqnbqsvsz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349827.1473603-473-249892112726253/AnsiballZ_copy.py'
Nov 28 17:10:28 compute-0 sudo[69673]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:10:28 compute-0 python3.9[69675]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764349827.1473603-473-249892112726253/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:10:28 compute-0 sudo[69673]: pam_unix(sudo:session): session closed for user root
Nov 28 17:10:29 compute-0 sudo[69826]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mkidadadbiyoaxbipzxwsufsjtqmzdfs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349829.0568943-503-16457268188/AnsiballZ_systemd.py'
Nov 28 17:10:29 compute-0 sudo[69826]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:10:29 compute-0 python3.9[69828]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 17:10:29 compute-0 systemd[1]: Reloading OpenSSH server daemon...
Nov 28 17:10:29 compute-0 systemd[1]: Reloaded OpenSSH server daemon.
Nov 28 17:10:29 compute-0 sshd[1006]: Received SIGHUP; restarting.
Nov 28 17:10:29 compute-0 sshd[1006]: Server listening on 0.0.0.0 port 22.
Nov 28 17:10:29 compute-0 sshd[1006]: Server listening on :: port 22.
Nov 28 17:10:29 compute-0 sudo[69826]: pam_unix(sudo:session): session closed for user root
Nov 28 17:10:30 compute-0 sudo[69982]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mqxxwhmlbhpjgukkkxkfgtalusinnqnj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349829.9252687-519-36436872287555/AnsiballZ_file.py'
Nov 28 17:10:30 compute-0 sudo[69982]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:10:30 compute-0 python3.9[69984]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:10:30 compute-0 sudo[69982]: pam_unix(sudo:session): session closed for user root
Nov 28 17:10:30 compute-0 sudo[70134]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xktkcwwbkmbwqciunbjxjnqjsohmogxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349830.6623688-535-74835894874887/AnsiballZ_stat.py'
Nov 28 17:10:30 compute-0 sudo[70134]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:10:31 compute-0 python3.9[70136]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:10:31 compute-0 sudo[70134]: pam_unix(sudo:session): session closed for user root
Nov 28 17:10:31 compute-0 sudo[70257]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdtlnbiaccfwdopiazkzoacdkpfkqxwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349830.6623688-535-74835894874887/AnsiballZ_copy.py'
Nov 28 17:10:31 compute-0 sudo[70257]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:10:31 compute-0 python3.9[70259]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764349830.6623688-535-74835894874887/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:10:31 compute-0 sudo[70257]: pam_unix(sudo:session): session closed for user root
Nov 28 17:10:32 compute-0 sudo[70409]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-loinynerwvzxidpocdixhalcpbrhjlib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349832.2876883-571-5555402145994/AnsiballZ_timezone.py'
Nov 28 17:10:32 compute-0 sudo[70409]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:10:32 compute-0 python3.9[70411]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Nov 28 17:10:32 compute-0 systemd[1]: Starting Time & Date Service...
Nov 28 17:10:33 compute-0 systemd[1]: Started Time & Date Service.
Nov 28 17:10:33 compute-0 sudo[70409]: pam_unix(sudo:session): session closed for user root
Nov 28 17:10:33 compute-0 sudo[70565]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwwawuiblxefhsjmgwmwysleismzptth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349833.384987-589-208514908309236/AnsiballZ_file.py'
Nov 28 17:10:33 compute-0 sudo[70565]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:10:33 compute-0 python3.9[70567]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:10:33 compute-0 sudo[70565]: pam_unix(sudo:session): session closed for user root
Nov 28 17:10:34 compute-0 sudo[70717]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-caohahhhginxtjcqglvhwiuutonagvzg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349834.0962758-605-199555233909092/AnsiballZ_stat.py'
Nov 28 17:10:34 compute-0 sudo[70717]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:10:34 compute-0 python3.9[70719]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:10:34 compute-0 sudo[70717]: pam_unix(sudo:session): session closed for user root
Nov 28 17:10:35 compute-0 sudo[70840]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sveadrrsirvbxekbkwqkhdvjqqmjwoha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349834.0962758-605-199555233909092/AnsiballZ_copy.py'
Nov 28 17:10:35 compute-0 sudo[70840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:10:35 compute-0 python3.9[70842]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764349834.0962758-605-199555233909092/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:10:35 compute-0 sudo[70840]: pam_unix(sudo:session): session closed for user root
Nov 28 17:10:35 compute-0 sudo[70992]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-trrjhqhcopmgjkvlslfwkzpzgpzfdhhx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349835.5965388-635-171826048940431/AnsiballZ_stat.py'
Nov 28 17:10:35 compute-0 sudo[70992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:10:36 compute-0 python3.9[70994]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:10:36 compute-0 sudo[70992]: pam_unix(sudo:session): session closed for user root
Nov 28 17:10:36 compute-0 sudo[71115]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xfrffretwmxmohbcxwqsdwufcdiddefm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349835.5965388-635-171826048940431/AnsiballZ_copy.py'
Nov 28 17:10:36 compute-0 sudo[71115]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:10:36 compute-0 python3.9[71117]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764349835.5965388-635-171826048940431/.source.yaml _original_basename=.pwwltrwf follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:10:36 compute-0 sudo[71115]: pam_unix(sudo:session): session closed for user root
Nov 28 17:10:37 compute-0 sudo[71267]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jpxqxslkwcpvlblhmhzfyynjckitnwmu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349836.8414338-665-125577391346080/AnsiballZ_stat.py'
Nov 28 17:10:37 compute-0 sudo[71267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:10:37 compute-0 python3.9[71269]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:10:37 compute-0 sudo[71267]: pam_unix(sudo:session): session closed for user root
Nov 28 17:10:37 compute-0 sudo[71390]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfvfboxuudtjkeaymbyiyzptmatfvlzu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349836.8414338-665-125577391346080/AnsiballZ_copy.py'
Nov 28 17:10:37 compute-0 sudo[71390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:10:37 compute-0 python3.9[71392]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764349836.8414338-665-125577391346080/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:10:37 compute-0 sudo[71390]: pam_unix(sudo:session): session closed for user root
Nov 28 17:10:38 compute-0 sudo[71542]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfuhcfovocxoyktzpvusyxuyfhukmsuq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349838.2831147-695-165504150905695/AnsiballZ_command.py'
Nov 28 17:10:38 compute-0 sudo[71542]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:10:38 compute-0 python3.9[71544]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 17:10:38 compute-0 sudo[71542]: pam_unix(sudo:session): session closed for user root
Nov 28 17:10:39 compute-0 sudo[71695]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kcqfagfnsquthtkgyntpfizzilkfgjtd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349838.9902081-711-58978236006789/AnsiballZ_command.py'
Nov 28 17:10:39 compute-0 sudo[71695]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:10:39 compute-0 python3.9[71697]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 17:10:39 compute-0 sudo[71695]: pam_unix(sudo:session): session closed for user root
Nov 28 17:10:40 compute-0 sudo[71848]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ekrzboktstrdvqmwncfsbuhiohncabzr ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764349839.8623202-727-166815352837225/AnsiballZ_edpm_nftables_from_files.py'
Nov 28 17:10:40 compute-0 sudo[71848]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:10:40 compute-0 python3[71850]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 28 17:10:40 compute-0 sudo[71848]: pam_unix(sudo:session): session closed for user root
Nov 28 17:10:41 compute-0 sudo[72000]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhyxinqynbifuairkrlvsbsevssawvjf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349840.7302113-743-66165938247504/AnsiballZ_stat.py'
Nov 28 17:10:41 compute-0 sudo[72000]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:10:41 compute-0 python3.9[72002]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:10:41 compute-0 sudo[72000]: pam_unix(sudo:session): session closed for user root
Nov 28 17:10:41 compute-0 sudo[72123]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iyzlsgzgwvyoapkaikmfdlqmewvprbxo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349840.7302113-743-66165938247504/AnsiballZ_copy.py'
Nov 28 17:10:41 compute-0 sudo[72123]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:10:41 compute-0 python3.9[72125]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764349840.7302113-743-66165938247504/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:10:41 compute-0 sudo[72123]: pam_unix(sudo:session): session closed for user root
Nov 28 17:10:42 compute-0 sudo[72275]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-slaffvstvxsqwpxilmjifjobglllucte ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349842.1066442-773-184515639451071/AnsiballZ_stat.py'
Nov 28 17:10:42 compute-0 sudo[72275]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:10:42 compute-0 python3.9[72277]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:10:42 compute-0 sudo[72275]: pam_unix(sudo:session): session closed for user root
Nov 28 17:10:43 compute-0 sudo[72398]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jldkilojcklzhyyderwrilcqhuukguhf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349842.1066442-773-184515639451071/AnsiballZ_copy.py'
Nov 28 17:10:43 compute-0 sudo[72398]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:10:43 compute-0 python3.9[72400]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764349842.1066442-773-184515639451071/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:10:43 compute-0 sudo[72398]: pam_unix(sudo:session): session closed for user root
Nov 28 17:10:43 compute-0 sudo[72550]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fwlclllcpkkuyagkqeprhrdnnoyhkpst ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349843.6259003-803-222738896309023/AnsiballZ_stat.py'
Nov 28 17:10:43 compute-0 sudo[72550]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:10:44 compute-0 python3.9[72552]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:10:44 compute-0 sudo[72550]: pam_unix(sudo:session): session closed for user root
Nov 28 17:10:44 compute-0 sudo[72673]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwjtlyjgvykvfcspxenqbblibcpckgel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349843.6259003-803-222738896309023/AnsiballZ_copy.py'
Nov 28 17:10:44 compute-0 sudo[72673]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:10:44 compute-0 python3.9[72675]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764349843.6259003-803-222738896309023/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:10:44 compute-0 sudo[72673]: pam_unix(sudo:session): session closed for user root
Nov 28 17:10:45 compute-0 sudo[72825]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ypyfaskogdupogfkdnejpqzbvwmxdneo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349845.024901-833-267723411218004/AnsiballZ_stat.py'
Nov 28 17:10:45 compute-0 sudo[72825]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:10:45 compute-0 python3.9[72827]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:10:45 compute-0 sudo[72825]: pam_unix(sudo:session): session closed for user root
Nov 28 17:10:45 compute-0 sudo[72948]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ptuhkbypgogdnmmmdiimomsipdpyydnh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349845.024901-833-267723411218004/AnsiballZ_copy.py'
Nov 28 17:10:45 compute-0 sudo[72948]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:10:46 compute-0 python3.9[72950]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764349845.024901-833-267723411218004/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:10:46 compute-0 sudo[72948]: pam_unix(sudo:session): session closed for user root
Nov 28 17:10:46 compute-0 sudo[73100]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubxbpwamnlyhsjnmxcdjkajtodibglgf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349846.294567-863-150671942044183/AnsiballZ_stat.py'
Nov 28 17:10:46 compute-0 sudo[73100]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:10:46 compute-0 python3.9[73102]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:10:46 compute-0 sudo[73100]: pam_unix(sudo:session): session closed for user root
Nov 28 17:10:47 compute-0 sudo[73223]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ctzobznxlriauzosnhfzxjkbfppujmzb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349846.294567-863-150671942044183/AnsiballZ_copy.py'
Nov 28 17:10:47 compute-0 sudo[73223]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:10:47 compute-0 python3.9[73225]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764349846.294567-863-150671942044183/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:10:47 compute-0 sudo[73223]: pam_unix(sudo:session): session closed for user root
Nov 28 17:10:48 compute-0 sudo[73375]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kiwanusdhyhfjrxvwsfdzvxhztlkctau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349847.7131689-893-159439132376720/AnsiballZ_file.py'
Nov 28 17:10:48 compute-0 sudo[73375]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:10:48 compute-0 python3.9[73377]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:10:48 compute-0 sudo[73375]: pam_unix(sudo:session): session closed for user root
Nov 28 17:10:48 compute-0 sudo[73527]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdqvhvftxtqckbbnbglkcotirsftfflf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349848.4619064-909-214651519016012/AnsiballZ_command.py'
Nov 28 17:10:48 compute-0 sudo[73527]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:10:48 compute-0 python3.9[73529]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 17:10:49 compute-0 sudo[73527]: pam_unix(sudo:session): session closed for user root
Nov 28 17:10:49 compute-0 sudo[73686]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbozsesxrrmdpghhnlvsufdoftjjcuuz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349849.3114955-925-82306450980507/AnsiballZ_blockinfile.py'
Nov 28 17:10:49 compute-0 sudo[73686]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:10:49 compute-0 python3.9[73688]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:10:49 compute-0 sudo[73686]: pam_unix(sudo:session): session closed for user root
Nov 28 17:10:50 compute-0 sudo[73839]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atrkpvbideeqxnbmekmrqrgrkcotnfmz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349850.28707-943-250182981165701/AnsiballZ_file.py'
Nov 28 17:10:50 compute-0 sudo[73839]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:10:50 compute-0 python3.9[73841]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:10:50 compute-0 sudo[73839]: pam_unix(sudo:session): session closed for user root
Nov 28 17:10:51 compute-0 sudo[73991]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxganodavvxuejvgyuacunpfailswned ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349850.952618-943-119181713309229/AnsiballZ_file.py'
Nov 28 17:10:51 compute-0 sudo[73991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:10:51 compute-0 python3.9[73993]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:10:51 compute-0 sudo[73991]: pam_unix(sudo:session): session closed for user root
Nov 28 17:10:52 compute-0 sudo[74143]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wpcurlrjxdigzcycuionqwyxgwosjwep ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349851.664395-973-237035538559629/AnsiballZ_mount.py'
Nov 28 17:10:52 compute-0 sudo[74143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:10:52 compute-0 python3.9[74145]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Nov 28 17:10:52 compute-0 sudo[74143]: pam_unix(sudo:session): session closed for user root
Nov 28 17:10:52 compute-0 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 17:10:52 compute-0 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 17:10:52 compute-0 sudo[74297]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwblfdciriickgktceqpwkcwsorfaemh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349852.5081134-973-205570706841356/AnsiballZ_mount.py'
Nov 28 17:10:52 compute-0 sudo[74297]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:10:53 compute-0 python3.9[74299]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Nov 28 17:10:53 compute-0 sudo[74297]: pam_unix(sudo:session): session closed for user root
Nov 28 17:10:53 compute-0 sshd-session[65138]: Connection closed by 192.168.122.30 port 38676
Nov 28 17:10:53 compute-0 sshd-session[65135]: pam_unix(sshd:session): session closed for user zuul
Nov 28 17:10:53 compute-0 systemd-logind[788]: Session 15 logged out. Waiting for processes to exit.
Nov 28 17:10:53 compute-0 systemd[1]: session-15.scope: Deactivated successfully.
Nov 28 17:10:53 compute-0 systemd[1]: session-15.scope: Consumed 34.643s CPU time.
Nov 28 17:10:53 compute-0 systemd-logind[788]: Removed session 15.
Nov 28 17:10:59 compute-0 sshd-session[74325]: Accepted publickey for zuul from 192.168.122.30 port 51210 ssh2: ECDSA SHA256:WZF57Px3euv6N78lWIGFzFXKuQ6jzdjGOx/DgOfNYD8
Nov 28 17:10:59 compute-0 systemd-logind[788]: New session 16 of user zuul.
Nov 28 17:10:59 compute-0 systemd[1]: Started Session 16 of User zuul.
Nov 28 17:10:59 compute-0 sshd-session[74325]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 28 17:10:59 compute-0 sudo[74478]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pdcjqofbcxvoskqrfjckhpqmqyggdbwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349859.2828276-22-142795436778638/AnsiballZ_tempfile.py'
Nov 28 17:10:59 compute-0 sudo[74478]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:11:00 compute-0 python3.9[74480]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Nov 28 17:11:00 compute-0 sudo[74478]: pam_unix(sudo:session): session closed for user root
Nov 28 17:11:01 compute-0 sudo[74630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-usiqppxavgsnilaoqdjurjgaklzdrhhi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349860.8440301-46-213139430582411/AnsiballZ_stat.py'
Nov 28 17:11:01 compute-0 sudo[74630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:11:01 compute-0 python3.9[74632]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 17:11:01 compute-0 sudo[74630]: pam_unix(sudo:session): session closed for user root
Nov 28 17:11:03 compute-0 sudo[74782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oibrcxsqnoenworpglgiqtiulmufvajf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349861.7375424-66-12356857908431/AnsiballZ_setup.py'
Nov 28 17:11:03 compute-0 sudo[74782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:11:03 compute-0 systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 28 17:11:03 compute-0 python3.9[74784]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 17:11:03 compute-0 sudo[74782]: pam_unix(sudo:session): session closed for user root
Nov 28 17:11:04 compute-0 sudo[74936]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-klbgaekwoujhyyuefhbxvklwylnxbhkq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349863.6257474-83-90852235378224/AnsiballZ_blockinfile.py'
Nov 28 17:11:04 compute-0 sudo[74936]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:11:04 compute-0 python3.9[74938]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDAxT53rn8Ty9F7dAtRWYXBLcCwT80o1aLNX/jGeiduSSp5PQsoK0ZEoiARnwUv/uaV73bdLfhvA+K7D2A4F7RDF7RbifjJe1KnkzQSBGGz7XkVS4+VVe+2KccprDMTa3jzaFJhA9+PBqSFKvZ5ro2QzF0jMc/6994pH9++kClS4XbP7YWHTWDvAfnUggm/IWl+Eo6xAfHzDC4vUtvDVEY+6cll7csju0lY0qaF4ql2H1Q4LCgejQpZDRChjCJpBnc/tHo5xtMlh/0k+tZbFCtmZfbmvLqVxVNZUt1n5vt/FZmZjYXlO1lkUCpfjRCQcswh1zahRgeUWRc4FnD8Q9KME5OZY/irjZyHxj/3pXQv+fP4TQheYPFUaG15S89oHuC+SqSgVFLuiB542wxp6YKZbSQPeDcC8U2ibHrln8aY9eJ5Tdi2UykabwgN4uoEnO8NVJ+CwGfzJegd8pKyl+/2Gkfxu8iCajJu+Q7NxPQ66OZfe5XGsLR9MaLQNtDCFYU=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEqyw570lNsBQsmtvGhWChq27EuSQf7MNY28byja8RZ0
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBC7jpCHMim8nXp7AwrosMuPw+aK2S/LTgUszTJdTiF7osRDmEPuu6132abYZOcAcJksGOtmpktBQgsg+n3ZhPSQ=
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQChlGtpllN/b/R95xIcizpbeMwrxWH5KPGPQiTD4eroe4AgdYpUhDFdlXyAGtrfqmvXGNBWi3PIUnSKmmZjlUGSvwkfZ5HLQdpp0/QYGjPsCAzitx1hdX08ekxxqPwkTe3RkS1wdcPWpFP1TqlilJwY0HbjTMZ8+u4QfEc9P5VVfYI3wSGuqq2640h0WSBWxf0hNVbcyl85jx2yvllqm7PsnbYyXgdUZnH/OKwt8TSFLaFjjDZMGrEeDOEuLvbgxPjPX1AjpUdm6T1eESFIN7zH7QU82Ev39tYASuo8s649UF7nRAPv1H1oqb7DiPxnGWnQ0VJ/lHSPUCXrWOZQk5OeIlobwsbZloH1k0ysewLbBAUMFI1m808hu6eq0bkM0DOQnb23iBmPEhsnJWfqwIbtIduwyNXgwbF5qQFoqFazNndwbtzVTHvQWw6mejVYpMt5BiUnC1WSo9VHqaZpqr5gDS8ndHf6ASTvg+NLAwC82ZuLjScOwex2kplfESU6X28=
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILQwJResDkYA3BLC7hS+fsJPG0etEm0fG8jqRFytEBVi
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBE4mngiayFGXHxiukAOWP0RRIfNjqwEP36Op7VH4/DOv5DOtIk8xnwddqtcoGzqpWI9ICqKqUD4Vl72O76R/Eqs=
                                             create=True mode=0644 path=/tmp/ansible.cxehx1gy state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:11:04 compute-0 sudo[74936]: pam_unix(sudo:session): session closed for user root
Nov 28 17:11:04 compute-0 sudo[75088]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-johmsyulvvroxtishlrgqclmkrhejibc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349864.4575233-99-221293828949059/AnsiballZ_command.py'
Nov 28 17:11:04 compute-0 sudo[75088]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:11:05 compute-0 python3.9[75090]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.cxehx1gy' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 17:11:05 compute-0 sudo[75088]: pam_unix(sudo:session): session closed for user root
Nov 28 17:11:05 compute-0 sudo[75242]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gqiwjriprrdimnaepruycgljhbcvxhsc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349865.2772298-115-118058166476835/AnsiballZ_file.py'
Nov 28 17:11:05 compute-0 sudo[75242]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:11:05 compute-0 python3.9[75244]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.cxehx1gy state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:11:05 compute-0 sudo[75242]: pam_unix(sudo:session): session closed for user root
Nov 28 17:11:06 compute-0 sshd-session[74328]: Connection closed by 192.168.122.30 port 51210
Nov 28 17:11:06 compute-0 sshd-session[74325]: pam_unix(sshd:session): session closed for user zuul
Nov 28 17:11:06 compute-0 systemd[1]: session-16.scope: Deactivated successfully.
Nov 28 17:11:06 compute-0 systemd[1]: session-16.scope: Consumed 3.550s CPU time.
Nov 28 17:11:06 compute-0 systemd-logind[788]: Session 16 logged out. Waiting for processes to exit.
Nov 28 17:11:06 compute-0 systemd-logind[788]: Removed session 16.
Nov 28 17:11:12 compute-0 sshd-session[75269]: Accepted publickey for zuul from 192.168.122.30 port 53994 ssh2: ECDSA SHA256:WZF57Px3euv6N78lWIGFzFXKuQ6jzdjGOx/DgOfNYD8
Nov 28 17:11:12 compute-0 systemd-logind[788]: New session 17 of user zuul.
Nov 28 17:11:12 compute-0 systemd[1]: Started Session 17 of User zuul.
Nov 28 17:11:12 compute-0 sshd-session[75269]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 28 17:11:13 compute-0 python3.9[75422]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 17:11:14 compute-0 sudo[75576]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbsvtdlukccpxwwzzoyaeerqbjfnqgqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349873.9789844-49-241605406962184/AnsiballZ_systemd.py'
Nov 28 17:11:14 compute-0 sudo[75576]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:11:15 compute-0 python3.9[75578]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 28 17:11:15 compute-0 sudo[75576]: pam_unix(sudo:session): session closed for user root
Nov 28 17:11:15 compute-0 sudo[75730]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-askehluugfxtsdprnwebpjzntzdcxndz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349875.3580546-65-122543064593148/AnsiballZ_systemd.py'
Nov 28 17:11:15 compute-0 sudo[75730]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:11:16 compute-0 python3.9[75732]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 17:11:16 compute-0 sudo[75730]: pam_unix(sudo:session): session closed for user root
Nov 28 17:11:16 compute-0 sudo[75883]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wocpzlybxhvjhexehwvsllbdglrgwfcp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349876.4133675-83-138944699325181/AnsiballZ_command.py'
Nov 28 17:11:16 compute-0 sudo[75883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:11:17 compute-0 python3.9[75885]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 17:11:17 compute-0 sudo[75883]: pam_unix(sudo:session): session closed for user root
Nov 28 17:11:17 compute-0 sudo[76036]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unnyffxzvsxeywqtyshtuvaahnextgor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349877.3203151-99-279772732707279/AnsiballZ_stat.py'
Nov 28 17:11:17 compute-0 sudo[76036]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:11:17 compute-0 python3.9[76038]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 17:11:17 compute-0 sudo[76036]: pam_unix(sudo:session): session closed for user root
Nov 28 17:11:18 compute-0 sudo[76190]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sorfotakugrxtjjrhigqrutcmeeeupyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349878.202591-115-232771122188164/AnsiballZ_command.py'
Nov 28 17:11:18 compute-0 sudo[76190]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:11:18 compute-0 python3.9[76192]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 17:11:18 compute-0 sudo[76190]: pam_unix(sudo:session): session closed for user root
Nov 28 17:11:19 compute-0 sudo[76345]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kfwbicwxqdmlpuiefgutnrpfafipbmqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349878.9219024-131-143908076374368/AnsiballZ_file.py'
Nov 28 17:11:19 compute-0 sudo[76345]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:11:19 compute-0 python3.9[76347]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:11:19 compute-0 sudo[76345]: pam_unix(sudo:session): session closed for user root
Nov 28 17:11:20 compute-0 sshd-session[75272]: Connection closed by 192.168.122.30 port 53994
Nov 28 17:11:20 compute-0 sshd-session[75269]: pam_unix(sshd:session): session closed for user zuul
Nov 28 17:11:20 compute-0 systemd[1]: session-17.scope: Deactivated successfully.
Nov 28 17:11:20 compute-0 systemd[1]: session-17.scope: Consumed 4.432s CPU time.
Nov 28 17:11:20 compute-0 systemd-logind[788]: Session 17 logged out. Waiting for processes to exit.
Nov 28 17:11:20 compute-0 systemd-logind[788]: Removed session 17.
Nov 28 17:11:25 compute-0 sshd-session[76372]: Accepted publickey for zuul from 192.168.122.30 port 45340 ssh2: ECDSA SHA256:WZF57Px3euv6N78lWIGFzFXKuQ6jzdjGOx/DgOfNYD8
Nov 28 17:11:25 compute-0 systemd-logind[788]: New session 18 of user zuul.
Nov 28 17:11:25 compute-0 systemd[1]: Started Session 18 of User zuul.
Nov 28 17:11:25 compute-0 sshd-session[76372]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 28 17:11:26 compute-0 python3.9[76525]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 17:11:27 compute-0 sudo[76679]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xximqvufkmlepveydilooxhevwekbokr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349886.9366987-52-77924018277380/AnsiballZ_setup.py'
Nov 28 17:11:27 compute-0 sudo[76679]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:11:27 compute-0 python3.9[76681]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 17:11:27 compute-0 sudo[76679]: pam_unix(sudo:session): session closed for user root
Nov 28 17:11:28 compute-0 sudo[76763]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwiwowsseakdlxxetjlghapuirgwmpbj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349886.9366987-52-77924018277380/AnsiballZ_dnf.py'
Nov 28 17:11:28 compute-0 sudo[76763]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:11:28 compute-0 python3.9[76765]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 28 17:11:29 compute-0 sudo[76763]: pam_unix(sudo:session): session closed for user root
Nov 28 17:11:30 compute-0 python3.9[76916]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 17:11:32 compute-0 python3.9[77067]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 28 17:11:33 compute-0 python3.9[77217]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 17:11:33 compute-0 python3.9[77367]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 17:11:34 compute-0 sshd-session[76375]: Connection closed by 192.168.122.30 port 45340
Nov 28 17:11:34 compute-0 sshd-session[76372]: pam_unix(sshd:session): session closed for user zuul
Nov 28 17:11:34 compute-0 systemd[1]: session-18.scope: Deactivated successfully.
Nov 28 17:11:34 compute-0 systemd[1]: session-18.scope: Consumed 5.929s CPU time.
Nov 28 17:11:34 compute-0 systemd-logind[788]: Session 18 logged out. Waiting for processes to exit.
Nov 28 17:11:34 compute-0 systemd-logind[788]: Removed session 18.
Nov 28 17:11:39 compute-0 sshd-session[77392]: Accepted publickey for zuul from 192.168.122.30 port 38648 ssh2: ECDSA SHA256:WZF57Px3euv6N78lWIGFzFXKuQ6jzdjGOx/DgOfNYD8
Nov 28 17:11:39 compute-0 systemd-logind[788]: New session 19 of user zuul.
Nov 28 17:11:39 compute-0 systemd[1]: Started Session 19 of User zuul.
Nov 28 17:11:39 compute-0 sshd-session[77392]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 28 17:11:40 compute-0 python3.9[77545]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 17:11:42 compute-0 sudo[77699]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzahoqpxnmrhpxfewwcnipwahxrozaqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349901.6785095-85-124375539126455/AnsiballZ_file.py'
Nov 28 17:11:42 compute-0 sudo[77699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:11:42 compute-0 python3.9[77701]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:11:42 compute-0 sudo[77699]: pam_unix(sudo:session): session closed for user root
Nov 28 17:11:42 compute-0 sudo[77851]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gppwjtjysnkkejbbvmncbjnisfwylmry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349902.4820137-85-61954462055802/AnsiballZ_file.py'
Nov 28 17:11:42 compute-0 sudo[77851]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:11:42 compute-0 python3.9[77853]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:11:42 compute-0 sudo[77851]: pam_unix(sudo:session): session closed for user root
Nov 28 17:11:43 compute-0 sudo[78003]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ccrmsbxskyntxigzsarjspphqkyrdcxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349903.1529677-114-137146231020976/AnsiballZ_stat.py'
Nov 28 17:11:43 compute-0 sudo[78003]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:11:43 compute-0 python3.9[78005]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:11:43 compute-0 sudo[78003]: pam_unix(sudo:session): session closed for user root
Nov 28 17:11:44 compute-0 sudo[78126]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oeenzwobiuvxmtvxnnegtibtrgmibqou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349903.1529677-114-137146231020976/AnsiballZ_copy.py'
Nov 28 17:11:44 compute-0 sudo[78126]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:11:44 compute-0 python3.9[78128]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764349903.1529677-114-137146231020976/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=b339a98dfe30d906495ecb14e6486ec894d98a68 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:11:44 compute-0 sudo[78126]: pam_unix(sudo:session): session closed for user root
Nov 28 17:11:44 compute-0 sudo[78278]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iltmedtwcylaenqokdtavvudhqprcmit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349904.6401846-114-228880671092823/AnsiballZ_stat.py'
Nov 28 17:11:44 compute-0 sudo[78278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:11:45 compute-0 python3.9[78280]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:11:45 compute-0 sudo[78278]: pam_unix(sudo:session): session closed for user root
Nov 28 17:11:45 compute-0 sudo[78401]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-onojbwnpdnwdxmtirvhpbqzqmbzesnot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349904.6401846-114-228880671092823/AnsiballZ_copy.py'
Nov 28 17:11:45 compute-0 sudo[78401]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:11:45 compute-0 python3.9[78403]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764349904.6401846-114-228880671092823/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=371b47e6696fefbe67908b0e0990da7c84cd74b6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:11:45 compute-0 sudo[78401]: pam_unix(sudo:session): session closed for user root
Nov 28 17:11:46 compute-0 sudo[78553]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojhqsbvcmnpimoydqkdmkpzusmixjpxb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349905.8139474-114-57528445083983/AnsiballZ_stat.py'
Nov 28 17:11:46 compute-0 sudo[78553]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:11:46 compute-0 python3.9[78555]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:11:46 compute-0 sudo[78553]: pam_unix(sudo:session): session closed for user root
Nov 28 17:11:46 compute-0 sudo[78676]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mqesmnxypoixwcaigflaqmkajnoiybfs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349905.8139474-114-57528445083983/AnsiballZ_copy.py'
Nov 28 17:11:46 compute-0 sudo[78676]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:11:46 compute-0 python3.9[78678]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764349905.8139474-114-57528445083983/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=3ccf76f6ab5d2124693582bc878296db17d6b537 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:11:46 compute-0 sudo[78676]: pam_unix(sudo:session): session closed for user root
Nov 28 17:11:47 compute-0 sudo[78828]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lrmmxpgwuymjueacwiwuxscxtqffxfdr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349907.157229-201-21628488239975/AnsiballZ_file.py'
Nov 28 17:11:47 compute-0 sudo[78828]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:11:47 compute-0 python3.9[78830]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:11:47 compute-0 sudo[78828]: pam_unix(sudo:session): session closed for user root
Nov 28 17:11:48 compute-0 sudo[78980]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odtzqwdpdpnyjlrckanwxdtuefhbgdgd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349907.7720103-201-261225865972108/AnsiballZ_file.py'
Nov 28 17:11:48 compute-0 sudo[78980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:11:48 compute-0 python3.9[78982]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:11:48 compute-0 sudo[78980]: pam_unix(sudo:session): session closed for user root
Nov 28 17:11:48 compute-0 sudo[79132]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qdfkiipbweorvhhspnwusqjqwvhtzxlw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349908.421102-226-99380827028337/AnsiballZ_stat.py'
Nov 28 17:11:48 compute-0 sudo[79132]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:11:48 compute-0 python3.9[79134]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:11:48 compute-0 sudo[79132]: pam_unix(sudo:session): session closed for user root
Nov 28 17:11:49 compute-0 sudo[79255]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcsdrguxujgeddkgbnqvazrqtnpehchk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349908.421102-226-99380827028337/AnsiballZ_copy.py'
Nov 28 17:11:49 compute-0 sudo[79255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:11:49 compute-0 python3.9[79257]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764349908.421102-226-99380827028337/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=57ff654197e253ff912df088833247fe4109eec8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:11:49 compute-0 sudo[79255]: pam_unix(sudo:session): session closed for user root
Nov 28 17:11:49 compute-0 sudo[79407]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kobrbcwchbppjxtottfvhqzwspeancbh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349909.639231-226-139670614255776/AnsiballZ_stat.py'
Nov 28 17:11:49 compute-0 sudo[79407]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:11:50 compute-0 python3.9[79409]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:11:50 compute-0 sudo[79407]: pam_unix(sudo:session): session closed for user root
Nov 28 17:11:50 compute-0 sudo[79530]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nnnzokgrdbkllplhbzrpvlsvmxbczabv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349909.639231-226-139670614255776/AnsiballZ_copy.py'
Nov 28 17:11:50 compute-0 sudo[79530]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:11:50 compute-0 python3.9[79532]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764349909.639231-226-139670614255776/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=e9bf49954c56a5ef0116bec3430c6931117bb468 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:11:50 compute-0 sudo[79530]: pam_unix(sudo:session): session closed for user root
Nov 28 17:11:51 compute-0 sudo[79682]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hqspqjmolwnpgvvtmqhqsdvxdcixnear ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349910.83052-226-240535386208560/AnsiballZ_stat.py'
Nov 28 17:11:51 compute-0 sudo[79682]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:11:51 compute-0 python3.9[79684]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:11:51 compute-0 sudo[79682]: pam_unix(sudo:session): session closed for user root
Nov 28 17:11:51 compute-0 sudo[79805]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nyqnfzqvwplulvrkxekyugczfayviqas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349910.83052-226-240535386208560/AnsiballZ_copy.py'
Nov 28 17:11:51 compute-0 sudo[79805]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:11:51 compute-0 python3.9[79807]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764349910.83052-226-240535386208560/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=a17a1231372c4098f9ef6f8839f34149f930620f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:11:51 compute-0 sudo[79805]: pam_unix(sudo:session): session closed for user root
Nov 28 17:11:52 compute-0 sudo[79957]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzcstabmwjgjfrsqilbobohlvtstnvvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349912.0815446-315-63169975800517/AnsiballZ_file.py'
Nov 28 17:11:52 compute-0 sudo[79957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:11:52 compute-0 python3.9[79959]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:11:52 compute-0 sudo[79957]: pam_unix(sudo:session): session closed for user root
Nov 28 17:11:52 compute-0 sudo[80109]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pazyajocfzoofbnxbejlmecfxkfvgcha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349912.6548283-315-163526054235117/AnsiballZ_file.py'
Nov 28 17:11:52 compute-0 sudo[80109]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:11:53 compute-0 python3.9[80111]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:11:53 compute-0 sudo[80109]: pam_unix(sudo:session): session closed for user root
Nov 28 17:11:53 compute-0 sudo[80261]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lklalqmzcsbmxwqjepkjgoyaixhzynpd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349913.365547-346-212717103886860/AnsiballZ_stat.py'
Nov 28 17:11:53 compute-0 sudo[80261]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:11:53 compute-0 python3.9[80263]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:11:53 compute-0 sudo[80261]: pam_unix(sudo:session): session closed for user root
Nov 28 17:11:54 compute-0 sudo[80384]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lnhreygashqouughgwvvramgpdavjcnm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349913.365547-346-212717103886860/AnsiballZ_copy.py'
Nov 28 17:11:54 compute-0 sudo[80384]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:11:54 compute-0 python3.9[80386]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764349913.365547-346-212717103886860/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=8f7873bfb46bd3767990008e297c5c006052bb79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:11:54 compute-0 sudo[80384]: pam_unix(sudo:session): session closed for user root
Nov 28 17:11:54 compute-0 sudo[80536]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqtemyagzxutxkumwrxuzrsoeywgcwtt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349914.5435448-346-168301023921249/AnsiballZ_stat.py'
Nov 28 17:11:54 compute-0 sudo[80536]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:11:54 compute-0 python3.9[80538]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:11:54 compute-0 sudo[80536]: pam_unix(sudo:session): session closed for user root
Nov 28 17:11:55 compute-0 sudo[80659]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atjnkmyzysvhlxnxsgeibctqdpmhvyjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349914.5435448-346-168301023921249/AnsiballZ_copy.py'
Nov 28 17:11:55 compute-0 sudo[80659]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:11:55 compute-0 python3.9[80661]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764349914.5435448-346-168301023921249/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=9833c8350df867b07465cd9417ef9c0f8196c547 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:11:55 compute-0 sudo[80659]: pam_unix(sudo:session): session closed for user root
Nov 28 17:11:56 compute-0 sudo[80811]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thkqyeksbhuewbomzhcnojoymmxzfror ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349915.934677-346-68404121911511/AnsiballZ_stat.py'
Nov 28 17:11:56 compute-0 sudo[80811]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:11:56 compute-0 python3.9[80813]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:11:56 compute-0 sudo[80811]: pam_unix(sudo:session): session closed for user root
Nov 28 17:11:56 compute-0 sudo[80934]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lltavgvcjaecxtkbfacldpzztwsjhwqq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349915.934677-346-68404121911511/AnsiballZ_copy.py'
Nov 28 17:11:56 compute-0 sudo[80934]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:11:56 compute-0 python3.9[80936]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764349915.934677-346-68404121911511/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=4c030dffb9a2f79275b80ad6ffa8f9d4cbca89fc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:11:56 compute-0 sudo[80934]: pam_unix(sudo:session): session closed for user root
Nov 28 17:11:57 compute-0 sudo[81086]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-erajiutdwyjbltxtrbdvrhwzyboxtjty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349917.310047-434-84731034111514/AnsiballZ_file.py'
Nov 28 17:11:57 compute-0 sudo[81086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:11:57 compute-0 python3.9[81088]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:11:57 compute-0 sudo[81086]: pam_unix(sudo:session): session closed for user root
Nov 28 17:11:58 compute-0 sudo[81238]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qhvkuizenkpmitvpaznhjknrtrrpzqoz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349917.9462647-434-97377321717579/AnsiballZ_file.py'
Nov 28 17:11:58 compute-0 sudo[81238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:11:58 compute-0 python3.9[81240]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:11:58 compute-0 sudo[81238]: pam_unix(sudo:session): session closed for user root
Nov 28 17:11:58 compute-0 sudo[81390]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmyjtkudqlbkjxcztbtnflkrnsmizalk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349918.543146-462-161157732086373/AnsiballZ_stat.py'
Nov 28 17:11:58 compute-0 sudo[81390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:11:58 compute-0 python3.9[81392]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:11:58 compute-0 sudo[81390]: pam_unix(sudo:session): session closed for user root
Nov 28 17:11:59 compute-0 sudo[81513]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lubmnjikzmptggsrrghdssuuvwihqedv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349918.543146-462-161157732086373/AnsiballZ_copy.py'
Nov 28 17:11:59 compute-0 sudo[81513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:11:59 compute-0 python3.9[81515]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764349918.543146-462-161157732086373/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=d97cf8c443e49a4e79d16b6de3dd174cb7d82750 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:11:59 compute-0 sudo[81513]: pam_unix(sudo:session): session closed for user root
Nov 28 17:11:59 compute-0 sudo[81665]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sgpdqjxcqykruplhhrahvdysigxgqaha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349919.645347-462-132253594645679/AnsiballZ_stat.py'
Nov 28 17:11:59 compute-0 sudo[81665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:12:00 compute-0 python3.9[81667]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:12:00 compute-0 sudo[81665]: pam_unix(sudo:session): session closed for user root
Nov 28 17:12:00 compute-0 sudo[81788]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-daugolrgnkjqiiwxrdmlrlyihsegtqoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349919.645347-462-132253594645679/AnsiballZ_copy.py'
Nov 28 17:12:00 compute-0 sudo[81788]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:12:00 compute-0 python3.9[81790]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764349919.645347-462-132253594645679/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=9833c8350df867b07465cd9417ef9c0f8196c547 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:12:00 compute-0 sudo[81788]: pam_unix(sudo:session): session closed for user root
Nov 28 17:12:01 compute-0 sudo[81940]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kllqxoiggxqjqjftoplusuztxwzrpxse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349920.8694305-462-156466057255164/AnsiballZ_stat.py'
Nov 28 17:12:01 compute-0 sudo[81940]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:12:01 compute-0 python3.9[81942]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:12:01 compute-0 sudo[81940]: pam_unix(sudo:session): session closed for user root
Nov 28 17:12:01 compute-0 chronyd[65109]: Selected source 23.133.168.247 (pool.ntp.org)
Nov 28 17:12:01 compute-0 sudo[82063]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqumducjlwqtfurhgiqqbzwvsbbjbdxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349920.8694305-462-156466057255164/AnsiballZ_copy.py'
Nov 28 17:12:01 compute-0 sudo[82063]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:12:01 compute-0 python3.9[82065]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764349920.8694305-462-156466057255164/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=9e108e1e3189a09b6bfbef5a33c5a85e4b603715 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:12:01 compute-0 sudo[82063]: pam_unix(sudo:session): session closed for user root
Nov 28 17:12:02 compute-0 sudo[82215]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utpvqikvdjlsnjxoynorsukthmnaimhs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349922.5359023-571-210378187387821/AnsiballZ_file.py'
Nov 28 17:12:02 compute-0 sudo[82215]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:12:03 compute-0 python3.9[82217]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:12:03 compute-0 sudo[82215]: pam_unix(sudo:session): session closed for user root
Nov 28 17:12:03 compute-0 sudo[82367]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hgfohihgulafxgkeekqrwidoafrvzblw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349923.3203542-596-142605867447200/AnsiballZ_stat.py'
Nov 28 17:12:03 compute-0 sudo[82367]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:12:03 compute-0 python3.9[82369]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:12:03 compute-0 sudo[82367]: pam_unix(sudo:session): session closed for user root
Nov 28 17:12:04 compute-0 sudo[82490]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vdrxvmsahxujxwemaviybnubhbxggkqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349923.3203542-596-142605867447200/AnsiballZ_copy.py'
Nov 28 17:12:04 compute-0 sudo[82490]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:12:04 compute-0 python3.9[82492]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764349923.3203542-596-142605867447200/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=b43badc3cbdbb105df5fde112b52524c7b9f08f8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:12:04 compute-0 sudo[82490]: pam_unix(sudo:session): session closed for user root
Nov 28 17:12:04 compute-0 sudo[82642]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-szexazypriaskavgegqckmcspqhaftth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349924.6037052-628-83839198912620/AnsiballZ_file.py'
Nov 28 17:12:04 compute-0 sudo[82642]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:12:05 compute-0 python3.9[82644]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:12:05 compute-0 sudo[82642]: pam_unix(sudo:session): session closed for user root
Nov 28 17:12:05 compute-0 sudo[82794]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejyxjfqvonofvkkvakwwnldcnjqpmgvo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349925.2270503-643-271132791682327/AnsiballZ_stat.py'
Nov 28 17:12:05 compute-0 sudo[82794]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:12:05 compute-0 python3.9[82796]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:12:05 compute-0 sudo[82794]: pam_unix(sudo:session): session closed for user root
Nov 28 17:12:06 compute-0 sudo[82917]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ueghpnptkzvbefuosrtjuncwwtwrlehi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349925.2270503-643-271132791682327/AnsiballZ_copy.py'
Nov 28 17:12:06 compute-0 sudo[82917]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:12:06 compute-0 python3.9[82919]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764349925.2270503-643-271132791682327/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=b43badc3cbdbb105df5fde112b52524c7b9f08f8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:12:06 compute-0 sudo[82917]: pam_unix(sudo:session): session closed for user root
Nov 28 17:12:06 compute-0 sudo[83069]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqpecnzvushboawijxnygqlwxzqcfpmk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349926.448213-676-54034355823364/AnsiballZ_file.py'
Nov 28 17:12:06 compute-0 sudo[83069]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:12:06 compute-0 python3.9[83071]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:12:06 compute-0 sudo[83069]: pam_unix(sudo:session): session closed for user root
Nov 28 17:12:07 compute-0 sudo[83221]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jynpfymlidaakegxctvsrqkatqxugbxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349927.0724409-690-68978657002736/AnsiballZ_stat.py'
Nov 28 17:12:07 compute-0 sudo[83221]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:12:07 compute-0 python3.9[83223]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:12:07 compute-0 sudo[83221]: pam_unix(sudo:session): session closed for user root
Nov 28 17:12:07 compute-0 sudo[83344]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fszmfjhaflqphnrqjblflqnhftyhcxrm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349927.0724409-690-68978657002736/AnsiballZ_copy.py'
Nov 28 17:12:07 compute-0 sudo[83344]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:12:08 compute-0 python3.9[83346]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764349927.0724409-690-68978657002736/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=b43badc3cbdbb105df5fde112b52524c7b9f08f8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:12:08 compute-0 sudo[83344]: pam_unix(sudo:session): session closed for user root
Nov 28 17:12:08 compute-0 sudo[83496]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eouaqkbifgnogmsmuufpcxdutngorapg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349928.4016254-723-172698446365707/AnsiballZ_file.py'
Nov 28 17:12:08 compute-0 sudo[83496]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:12:08 compute-0 python3.9[83498]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:12:08 compute-0 sudo[83496]: pam_unix(sudo:session): session closed for user root
Nov 28 17:12:09 compute-0 sudo[83648]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nctxutsasorulbgpppqnhljumttnycjy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349929.0977361-739-140844782471132/AnsiballZ_stat.py'
Nov 28 17:12:09 compute-0 sudo[83648]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:12:09 compute-0 python3.9[83650]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:12:09 compute-0 sudo[83648]: pam_unix(sudo:session): session closed for user root
Nov 28 17:12:09 compute-0 sudo[83771]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dndjonrdfcltnkmyvwbogpdzvzcgjuch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349929.0977361-739-140844782471132/AnsiballZ_copy.py'
Nov 28 17:12:09 compute-0 sudo[83771]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:12:10 compute-0 python3.9[83773]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764349929.0977361-739-140844782471132/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=b43badc3cbdbb105df5fde112b52524c7b9f08f8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:12:10 compute-0 sudo[83771]: pam_unix(sudo:session): session closed for user root
Nov 28 17:12:10 compute-0 sudo[83923]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvynwgfvaqywwzcaxuyzvlzfjkmqrsxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349930.3012764-771-168804153684097/AnsiballZ_file.py'
Nov 28 17:12:10 compute-0 sudo[83923]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:12:10 compute-0 python3.9[83925]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:12:10 compute-0 sudo[83923]: pam_unix(sudo:session): session closed for user root
Nov 28 17:12:11 compute-0 sudo[84075]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lsfmmjwslysxvzkfxffkmnnvpctocgeb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349930.9114277-787-211511935497787/AnsiballZ_stat.py'
Nov 28 17:12:11 compute-0 sudo[84075]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:12:11 compute-0 python3.9[84077]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:12:11 compute-0 sudo[84075]: pam_unix(sudo:session): session closed for user root
Nov 28 17:12:11 compute-0 sudo[84198]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uazhprkwelnfjdcziilwrcuwcvjbluwn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349930.9114277-787-211511935497787/AnsiballZ_copy.py'
Nov 28 17:12:11 compute-0 sudo[84198]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:12:11 compute-0 python3.9[84200]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764349930.9114277-787-211511935497787/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=b43badc3cbdbb105df5fde112b52524c7b9f08f8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:12:11 compute-0 sudo[84198]: pam_unix(sudo:session): session closed for user root
Nov 28 17:12:12 compute-0 sudo[84350]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gvjokfstsaslwliyuhlxjrsounnwfrsg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349932.0838523-816-205666375805428/AnsiballZ_file.py'
Nov 28 17:12:12 compute-0 sudo[84350]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:12:12 compute-0 python3.9[84352]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:12:12 compute-0 sudo[84350]: pam_unix(sudo:session): session closed for user root
Nov 28 17:12:12 compute-0 sudo[84502]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bubzdxtprdnwsgpgmrifeuhcpuswzqpv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349932.7015934-833-31857060007291/AnsiballZ_stat.py'
Nov 28 17:12:12 compute-0 sudo[84502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:12:13 compute-0 python3.9[84504]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:12:13 compute-0 sudo[84502]: pam_unix(sudo:session): session closed for user root
Nov 28 17:12:13 compute-0 sudo[84625]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldzskjswmwuzscapuvxfkxotqkpxblvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349932.7015934-833-31857060007291/AnsiballZ_copy.py'
Nov 28 17:12:13 compute-0 sudo[84625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:12:13 compute-0 python3.9[84627]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764349932.7015934-833-31857060007291/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=b43badc3cbdbb105df5fde112b52524c7b9f08f8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:12:13 compute-0 sudo[84625]: pam_unix(sudo:session): session closed for user root
Nov 28 17:12:14 compute-0 sudo[84777]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ppxpltnvretqomooddxrjpkuqyaikinx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349933.9236455-864-192315946408265/AnsiballZ_file.py'
Nov 28 17:12:14 compute-0 sudo[84777]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:12:14 compute-0 python3.9[84779]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:12:14 compute-0 sudo[84777]: pam_unix(sudo:session): session closed for user root
Nov 28 17:12:14 compute-0 sudo[84929]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kazozflwsuhcmptkpnogfxbvkulilgds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349934.561871-878-139456281347185/AnsiballZ_stat.py'
Nov 28 17:12:14 compute-0 sudo[84929]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:12:15 compute-0 python3.9[84931]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:12:15 compute-0 sudo[84929]: pam_unix(sudo:session): session closed for user root
Nov 28 17:12:15 compute-0 sudo[85052]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-elvtrpjvafnwnvkmbbgervwaoegmfytu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349934.561871-878-139456281347185/AnsiballZ_copy.py'
Nov 28 17:12:15 compute-0 sudo[85052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:12:15 compute-0 python3.9[85054]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764349934.561871-878-139456281347185/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=b43badc3cbdbb105df5fde112b52524c7b9f08f8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:12:15 compute-0 sudo[85052]: pam_unix(sudo:session): session closed for user root
Nov 28 17:12:17 compute-0 sshd-session[77395]: Connection closed by 192.168.122.30 port 38648
Nov 28 17:12:17 compute-0 sshd-session[77392]: pam_unix(sshd:session): session closed for user zuul
Nov 28 17:12:17 compute-0 systemd[1]: session-19.scope: Deactivated successfully.
Nov 28 17:12:17 compute-0 systemd[1]: session-19.scope: Consumed 27.686s CPU time.
Nov 28 17:12:17 compute-0 systemd-logind[788]: Session 19 logged out. Waiting for processes to exit.
Nov 28 17:12:17 compute-0 systemd-logind[788]: Removed session 19.
Nov 28 17:12:23 compute-0 sshd-session[85079]: Accepted publickey for zuul from 192.168.122.30 port 59564 ssh2: ECDSA SHA256:WZF57Px3euv6N78lWIGFzFXKuQ6jzdjGOx/DgOfNYD8
Nov 28 17:12:23 compute-0 systemd-logind[788]: New session 20 of user zuul.
Nov 28 17:12:23 compute-0 systemd[1]: Started Session 20 of User zuul.
Nov 28 17:12:23 compute-0 sshd-session[85079]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 28 17:12:24 compute-0 python3.9[85232]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 17:12:25 compute-0 sudo[85386]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztrkukmawhvdhcmiyygcwhqnzbjwbwpd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349945.4493504-53-273856748018714/AnsiballZ_file.py'
Nov 28 17:12:25 compute-0 sudo[85386]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:12:26 compute-0 python3.9[85388]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:12:26 compute-0 sudo[85386]: pam_unix(sudo:session): session closed for user root
Nov 28 17:12:26 compute-0 sudo[85538]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqeicdkpstslhpgidnpcshjczeoitulp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349946.3132222-53-69492845868168/AnsiballZ_file.py'
Nov 28 17:12:26 compute-0 sudo[85538]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:12:26 compute-0 python3.9[85540]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:12:26 compute-0 sudo[85538]: pam_unix(sudo:session): session closed for user root
Nov 28 17:12:27 compute-0 python3.9[85690]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 17:12:28 compute-0 sudo[85840]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ulxqsjhzcctosqdhbdecwqnbgqwrkwyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349947.8112671-99-143064795450425/AnsiballZ_seboolean.py'
Nov 28 17:12:28 compute-0 sudo[85840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:12:28 compute-0 python3.9[85842]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Nov 28 17:12:29 compute-0 sudo[85840]: pam_unix(sudo:session): session closed for user root
Nov 28 17:12:31 compute-0 sudo[85996]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfxrtdujmskhipxabrhnegpnsqyrlidn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349951.588223-119-217229715100220/AnsiballZ_setup.py'
Nov 28 17:12:31 compute-0 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Nov 28 17:12:31 compute-0 sudo[85996]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:12:32 compute-0 python3.9[85998]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 17:12:32 compute-0 sudo[85996]: pam_unix(sudo:session): session closed for user root
Nov 28 17:12:32 compute-0 sudo[86080]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mtegwjhmzvclbdvtsgligxsbhzqrihgd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349951.588223-119-217229715100220/AnsiballZ_dnf.py'
Nov 28 17:12:32 compute-0 sudo[86080]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:12:33 compute-0 python3.9[86082]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 17:12:34 compute-0 sudo[86080]: pam_unix(sudo:session): session closed for user root
Nov 28 17:12:35 compute-0 sudo[86233]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxlokdryzwnwgwnylujbukopikybsbau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349954.7264726-143-251369271521851/AnsiballZ_systemd.py'
Nov 28 17:12:35 compute-0 sudo[86233]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:12:35 compute-0 python3.9[86235]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 28 17:12:35 compute-0 sudo[86233]: pam_unix(sudo:session): session closed for user root
Nov 28 17:12:36 compute-0 sudo[86388]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-akmgrtpuzrqqrxcnwuvhcfktipgvvuom ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764349956.0012357-159-5837843849444/AnsiballZ_edpm_nftables_snippet.py'
Nov 28 17:12:36 compute-0 sudo[86388]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:12:36 compute-0 python3[86390]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks
                                            rule:
                                              proto: udp
                                              dport: 4789
                                          - rule_name: 119 neutron geneve networks
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              state: ["UNTRACKED"]
                                          - rule_name: 120 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: OUTPUT
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                          - rule_name: 121 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: PREROUTING
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                           dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Nov 28 17:12:36 compute-0 sudo[86388]: pam_unix(sudo:session): session closed for user root
Nov 28 17:12:37 compute-0 sudo[86540]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnlpmxbnonusdfwroyvwzljlysxtgvyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349956.9654093-177-147121426277246/AnsiballZ_file.py'
Nov 28 17:12:37 compute-0 sudo[86540]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:12:37 compute-0 python3.9[86542]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:12:37 compute-0 sudo[86540]: pam_unix(sudo:session): session closed for user root
Nov 28 17:12:38 compute-0 sudo[86692]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zuclvavqbtcasgveeqilwkumfojdrdln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349957.6432328-193-135692098494221/AnsiballZ_stat.py'
Nov 28 17:12:38 compute-0 sudo[86692]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:12:38 compute-0 python3.9[86694]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:12:38 compute-0 sudo[86692]: pam_unix(sudo:session): session closed for user root
Nov 28 17:12:38 compute-0 sudo[86770]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhuxwkdgjgykvolllrpoccepsdolbakc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349957.6432328-193-135692098494221/AnsiballZ_file.py'
Nov 28 17:12:38 compute-0 sudo[86770]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:12:38 compute-0 python3.9[86772]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:12:38 compute-0 sudo[86770]: pam_unix(sudo:session): session closed for user root
Nov 28 17:12:39 compute-0 sudo[86922]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lbhgyhekkszvdcpkbcfsixjcobsiserz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349958.8958054-217-268238452825961/AnsiballZ_stat.py'
Nov 28 17:12:39 compute-0 sudo[86922]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:12:39 compute-0 python3.9[86924]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:12:39 compute-0 sudo[86922]: pam_unix(sudo:session): session closed for user root
Nov 28 17:12:39 compute-0 sudo[87000]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ynbdhncakzphyqynfzcuexlxvjimmxwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349958.8958054-217-268238452825961/AnsiballZ_file.py'
Nov 28 17:12:39 compute-0 sudo[87000]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:12:39 compute-0 python3.9[87002]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.0jd6gimk recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:12:39 compute-0 sudo[87000]: pam_unix(sudo:session): session closed for user root
Nov 28 17:12:40 compute-0 sudo[87152]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mdqohlwazjyxnfuoyuijajquchmwtaco ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349960.0374112-241-40726531087147/AnsiballZ_stat.py'
Nov 28 17:12:40 compute-0 sudo[87152]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:12:40 compute-0 python3.9[87154]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:12:40 compute-0 sudo[87152]: pam_unix(sudo:session): session closed for user root
Nov 28 17:12:40 compute-0 sudo[87230]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxloaecgidablzijiqtruwbddoyvtilq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349960.0374112-241-40726531087147/AnsiballZ_file.py'
Nov 28 17:12:40 compute-0 sudo[87230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:12:41 compute-0 python3.9[87232]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:12:41 compute-0 sudo[87230]: pam_unix(sudo:session): session closed for user root
Nov 28 17:12:41 compute-0 sudo[87382]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nlpgqlkkienatbyfrthmloapbwhmqrxq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349961.3977382-267-86710570234547/AnsiballZ_command.py'
Nov 28 17:12:41 compute-0 sudo[87382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:12:42 compute-0 python3.9[87384]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 17:12:42 compute-0 sudo[87382]: pam_unix(sudo:session): session closed for user root
Nov 28 17:12:42 compute-0 sudo[87535]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmbwrozxudzjowjsvbhbuagbsgqbqmgg ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764349962.296619-283-261082680136850/AnsiballZ_edpm_nftables_from_files.py'
Nov 28 17:12:42 compute-0 sudo[87535]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:12:42 compute-0 python3[87537]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 28 17:12:42 compute-0 sudo[87535]: pam_unix(sudo:session): session closed for user root
Nov 28 17:12:43 compute-0 sudo[87687]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jyfjblwyqrxwovgvspcuvufphuvhsbma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349963.1760578-299-276197522795349/AnsiballZ_stat.py'
Nov 28 17:12:43 compute-0 sudo[87687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:12:43 compute-0 python3.9[87689]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:12:43 compute-0 sudo[87687]: pam_unix(sudo:session): session closed for user root
Nov 28 17:12:44 compute-0 sudo[87812]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmstjfnxeykcxeafwhteqlgzkpydzgxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349963.1760578-299-276197522795349/AnsiballZ_copy.py'
Nov 28 17:12:44 compute-0 sudo[87812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:12:44 compute-0 python3.9[87814]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764349963.1760578-299-276197522795349/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:12:44 compute-0 sudo[87812]: pam_unix(sudo:session): session closed for user root
Nov 28 17:12:45 compute-0 sudo[87964]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cirtllbszjzjevjnktkyylmudebyechf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349964.7438498-329-260495729486565/AnsiballZ_stat.py'
Nov 28 17:12:45 compute-0 sudo[87964]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:12:45 compute-0 python3.9[87966]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:12:45 compute-0 sudo[87964]: pam_unix(sudo:session): session closed for user root
Nov 28 17:12:45 compute-0 sudo[88089]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adqfyjspdlpvuvouuhluzddghqkjzcnc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349964.7438498-329-260495729486565/AnsiballZ_copy.py'
Nov 28 17:12:45 compute-0 sudo[88089]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:12:45 compute-0 python3.9[88091]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764349964.7438498-329-260495729486565/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:12:45 compute-0 sudo[88089]: pam_unix(sudo:session): session closed for user root
Nov 28 17:12:46 compute-0 sudo[88241]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wflmcjdqlpvmycuexhoxqnqgkespekxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349965.9957237-359-23621678102626/AnsiballZ_stat.py'
Nov 28 17:12:46 compute-0 sudo[88241]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:12:46 compute-0 python3.9[88243]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:12:46 compute-0 sudo[88241]: pam_unix(sudo:session): session closed for user root
Nov 28 17:12:46 compute-0 sudo[88366]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eszaajwsmytydkobpzrydqjpkcmfbtgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349965.9957237-359-23621678102626/AnsiballZ_copy.py'
Nov 28 17:12:46 compute-0 sudo[88366]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:12:47 compute-0 python3.9[88368]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764349965.9957237-359-23621678102626/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:12:47 compute-0 sudo[88366]: pam_unix(sudo:session): session closed for user root
Nov 28 17:12:47 compute-0 sudo[88518]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cpswnlcltwdgmzpujbbwrtysgdtfgkni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349967.229879-389-106655386166041/AnsiballZ_stat.py'
Nov 28 17:12:47 compute-0 sudo[88518]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:12:47 compute-0 python3.9[88520]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:12:47 compute-0 sudo[88518]: pam_unix(sudo:session): session closed for user root
Nov 28 17:12:48 compute-0 sudo[88643]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cpklluzoevbqzxajnjsialmwfeeztiso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349967.229879-389-106655386166041/AnsiballZ_copy.py'
Nov 28 17:12:48 compute-0 sudo[88643]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:12:48 compute-0 python3.9[88645]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764349967.229879-389-106655386166041/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:12:48 compute-0 sudo[88643]: pam_unix(sudo:session): session closed for user root
Nov 28 17:12:49 compute-0 sudo[88795]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fwonkybxlabyohaanqdramprzclyeejh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349968.678956-419-41960400360120/AnsiballZ_stat.py'
Nov 28 17:12:49 compute-0 sudo[88795]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:12:49 compute-0 python3.9[88797]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:12:49 compute-0 sudo[88795]: pam_unix(sudo:session): session closed for user root
Nov 28 17:12:49 compute-0 sudo[88920]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-selkfxrjeowsbtlarrxlpmxvxwhjfyer ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349968.678956-419-41960400360120/AnsiballZ_copy.py'
Nov 28 17:12:49 compute-0 sudo[88920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:12:49 compute-0 python3.9[88922]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764349968.678956-419-41960400360120/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:12:49 compute-0 sudo[88920]: pam_unix(sudo:session): session closed for user root
Nov 28 17:12:50 compute-0 sudo[89072]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqsxrchxpvfabrqlndwguowgizvemxtf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349969.980763-449-162385333068129/AnsiballZ_file.py'
Nov 28 17:12:50 compute-0 sudo[89072]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:12:50 compute-0 python3.9[89074]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:12:50 compute-0 sudo[89072]: pam_unix(sudo:session): session closed for user root
Nov 28 17:12:50 compute-0 sudo[89224]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-btqyzpqudqquikavnbxgaajroffrsqmq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349970.6651328-465-160113706089747/AnsiballZ_command.py'
Nov 28 17:12:50 compute-0 sudo[89224]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:12:51 compute-0 python3.9[89226]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 17:12:51 compute-0 sudo[89224]: pam_unix(sudo:session): session closed for user root
Nov 28 17:12:51 compute-0 sudo[89379]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxzxfgyfdqujqmalorskxymcxwpqcykp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349971.3144307-481-113930985427135/AnsiballZ_blockinfile.py'
Nov 28 17:12:51 compute-0 sudo[89379]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:12:52 compute-0 python3.9[89381]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:12:52 compute-0 sudo[89379]: pam_unix(sudo:session): session closed for user root
Nov 28 17:12:52 compute-0 sudo[89531]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewseorwgkltbgsaglmjjeguyjmpmorne ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349972.292236-499-244192670201255/AnsiballZ_command.py'
Nov 28 17:12:52 compute-0 sudo[89531]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:12:52 compute-0 python3.9[89533]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 17:12:52 compute-0 sudo[89531]: pam_unix(sudo:session): session closed for user root
Nov 28 17:12:53 compute-0 sudo[89684]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-apmxwgntawcnvbzeouskreznxfzfufqq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349973.1125133-515-155232891252090/AnsiballZ_stat.py'
Nov 28 17:12:53 compute-0 sudo[89684]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:12:53 compute-0 python3.9[89686]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 17:12:53 compute-0 sudo[89684]: pam_unix(sudo:session): session closed for user root
Nov 28 17:12:54 compute-0 sudo[89838]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvcnpitadxgwoujbwrjmnoozbkygicfb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349973.8685994-531-112114069598164/AnsiballZ_command.py'
Nov 28 17:12:54 compute-0 sudo[89838]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:12:54 compute-0 python3.9[89840]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 17:12:54 compute-0 sudo[89838]: pam_unix(sudo:session): session closed for user root
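Annotation: the task above concatenates three nftables files and feeds them to `nft -f -` on stdin, so the flushes, rules, and jump updates are applied as one atomic transaction. A minimal sketch of how such a layered file set could look — only the three file names come from the log; the table, chain, and rule contents below are hypothetical:

```
# /etc/nftables/edpm-flushes.nft      -- empty the managed chains first
flush chain inet filter EDPM_INPUT

# /etc/nftables/edpm-rules.nft        -- repopulate them with the desired rules
add rule inet filter EDPM_INPUT tcp dport 22 accept

# /etc/nftables/edpm-update-jumps.nft -- make sure INPUT jumps into the managed chain
add rule inet filter INPUT jump EDPM_INPUT
```

Keeping flushes, rules, and jumps in separate files while applying them in a single `nft -f -` run avoids a window where the firewall has no rules loaded.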
Nov 28 17:12:55 compute-0 sudo[89993]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bijxiolkkptlqjxujgeccvjyzqehexjc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349974.6372564-547-35196294572470/AnsiballZ_file.py'
Nov 28 17:12:55 compute-0 sudo[89993]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:12:55 compute-0 python3.9[89995]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:12:55 compute-0 sudo[89993]: pam_unix(sudo:session): session closed for user root
Nov 28 17:12:56 compute-0 python3.9[90145]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 17:12:57 compute-0 sudo[90296]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-giggpzxmaxfqbarofftsodpoalqkypyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349977.2936597-627-84544849622653/AnsiballZ_command.py'
Nov 28 17:12:57 compute-0 sudo[90296]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:12:57 compute-0 python3.9[90298]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:3e:0a:c6:22:5a:f7" external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch 
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 17:12:57 compute-0 ovs-vsctl[90299]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:3e:0a:c6:22:5a:f7 external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Nov 28 17:12:57 compute-0 sudo[90296]: pam_unix(sudo:session): session closed for user root
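Annotation: the long `ovs-vsctl set open .` invocation above is just a list of `external_ids:key=value` settings on the Open_vSwitch record, which ovn-controller reads for its chassis configuration. A sketch of how that argv can be assembled from a dict — `build_cmd` is a hypothetical helper, not part of ovs-vsctl or the edpm role; the sample values are taken from the log:

```python
def build_cmd(external_ids: dict) -> list[str]:
    """Build argv for `ovs-vsctl set open .` from external_ids settings."""
    argv = ["ovs-vsctl", "set", "open", "."]
    # Each setting becomes one external_ids:key=value column assignment.
    argv += [f"external_ids:{key}={value}" for key, value in external_ids.items()]
    return argv

settings = {
    "ovn-bridge": "br-int",
    "ovn-encap-type": "geneve",
    "ovn-encap-ip": "172.19.0.100",
    "ovn-remote": "ssl:ovsdbserver-sb.openstack.svc:6642",
}
cmd = build_cmd(settings)
```

Running the resulting command requires a live ovsdb-server; the sketch only shows the argument mapping.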
Nov 28 17:12:58 compute-0 sudo[90449]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-afuphaxsidkyjwjxnsaykhlsedoxnjcg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349978.1331422-645-19066608567599/AnsiballZ_command.py'
Nov 28 17:12:58 compute-0 sudo[90449]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:12:58 compute-0 python3.9[90451]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                            ovs-vsctl show | grep -q "Manager"
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 17:12:58 compute-0 sudo[90449]: pam_unix(sudo:session): session closed for user root
Nov 28 17:12:59 compute-0 sudo[90604]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-azufjvkbnupzdfiztvhdbyfuvsxotvyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349978.9411628-661-163146959290051/AnsiballZ_command.py'
Nov 28 17:12:59 compute-0 sudo[90604]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:12:59 compute-0 python3.9[90606]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:6640:127.0.0.1\" -- add Open_vSwitch . manager_options @manager
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 17:12:59 compute-0 ovs-vsctl[90607]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Nov 28 17:12:59 compute-0 sudo[90604]: pam_unix(sudo:session): session closed for user root
Nov 28 17:13:00 compute-0 python3.9[90757]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 17:13:00 compute-0 sudo[90909]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rchozmjjegmijaocjdyqskiqqghemxmk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349980.4548464-695-162947038698946/AnsiballZ_file.py'
Nov 28 17:13:00 compute-0 sudo[90909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:13:00 compute-0 python3.9[90911]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:13:00 compute-0 sudo[90909]: pam_unix(sudo:session): session closed for user root
Nov 28 17:13:01 compute-0 sudo[91061]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vdtdnqdoncrfljmkewecsjvbfruhbxyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349981.1848814-711-121883544203176/AnsiballZ_stat.py'
Nov 28 17:13:01 compute-0 sudo[91061]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:13:01 compute-0 python3.9[91063]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:13:01 compute-0 sudo[91061]: pam_unix(sudo:session): session closed for user root
Nov 28 17:13:01 compute-0 sudo[91139]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lsgmzmluosfcwbdeemfsqjmlsmzoomsh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349981.1848814-711-121883544203176/AnsiballZ_file.py'
Nov 28 17:13:01 compute-0 sudo[91139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:13:02 compute-0 python3.9[91141]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:13:02 compute-0 sudo[91139]: pam_unix(sudo:session): session closed for user root
Nov 28 17:13:02 compute-0 sudo[91291]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnsnglcqjscnsqvpnhqlrrnmucyhvzek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349982.316233-711-8202782611622/AnsiballZ_stat.py'
Nov 28 17:13:02 compute-0 sudo[91291]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:13:02 compute-0 python3.9[91293]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:13:02 compute-0 sudo[91291]: pam_unix(sudo:session): session closed for user root
Nov 28 17:13:03 compute-0 sudo[91369]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjobbigfusrpsgkbgzalsjpnuuwqdova ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349982.316233-711-8202782611622/AnsiballZ_file.py'
Nov 28 17:13:03 compute-0 sudo[91369]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:13:03 compute-0 python3.9[91371]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:13:03 compute-0 sudo[91369]: pam_unix(sudo:session): session closed for user root
Nov 28 17:13:03 compute-0 sudo[91521]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wimgcnthmgdorhieyqvuxbbfxyjjgngv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349983.6503441-757-161535855863831/AnsiballZ_file.py'
Nov 28 17:13:03 compute-0 sudo[91521]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:13:04 compute-0 python3.9[91523]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:13:04 compute-0 sudo[91521]: pam_unix(sudo:session): session closed for user root
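Annotation: `mode=420` in the file task above is not an error — it is the decimal form of octal `0644`. When a playbook passes an unquoted `mode: 0644`, YAML typically parses it as an octal integer, and the module then logs the decimal value. A quick check in plain Python, independent of Ansible:

```python
# 0o644 (rw-r--r--) expressed in decimal is 420, which is what the
# ansible.builtin.file entry above logged as mode=420.
octal_mode = 0o644
print(octal_mode)   # -> 420
print(oct(420))     # -> 0o644
```

This is why Ansible documentation recommends quoting modes (`mode: "0644"`) to avoid surprises.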
Nov 28 17:13:04 compute-0 sudo[91673]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjtcsguximonuforvvwyzvjxlwjofzzl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349984.4074163-773-126715906099299/AnsiballZ_stat.py'
Nov 28 17:13:04 compute-0 sudo[91673]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:13:04 compute-0 python3.9[91675]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:13:04 compute-0 sudo[91673]: pam_unix(sudo:session): session closed for user root
Nov 28 17:13:05 compute-0 sudo[91751]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjcuvyidymzdklmdfbtpcgcrzilnzvch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349984.4074163-773-126715906099299/AnsiballZ_file.py'
Nov 28 17:13:05 compute-0 sudo[91751]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:13:05 compute-0 python3.9[91753]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:13:05 compute-0 sudo[91751]: pam_unix(sudo:session): session closed for user root
Nov 28 17:13:05 compute-0 sudo[91903]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-filrxakmncwyvwpuduwfzyfroojokjfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349985.616349-797-221156950137525/AnsiballZ_stat.py'
Nov 28 17:13:05 compute-0 sudo[91903]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:13:06 compute-0 python3.9[91905]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:13:06 compute-0 sudo[91903]: pam_unix(sudo:session): session closed for user root
Nov 28 17:13:06 compute-0 sudo[91981]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yujdjxfqlvkpfrezeuuztpulwfxrrxaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349985.616349-797-221156950137525/AnsiballZ_file.py'
Nov 28 17:13:06 compute-0 sudo[91981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:13:07 compute-0 python3.9[91983]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:13:07 compute-0 sudo[91981]: pam_unix(sudo:session): session closed for user root
Nov 28 17:13:07 compute-0 sudo[92133]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-syfczpzzgqldapccjcktkiqjljvzabqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349987.1779904-821-154754577909946/AnsiballZ_systemd.py'
Nov 28 17:13:07 compute-0 sudo[92133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:13:07 compute-0 python3.9[92135]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 17:13:07 compute-0 systemd[1]: Reloading.
Nov 28 17:13:07 compute-0 systemd-sysv-generator[92161]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 17:13:07 compute-0 systemd-rc-local-generator[92155]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 17:13:08 compute-0 sudo[92133]: pam_unix(sudo:session): session closed for user root
Nov 28 17:13:08 compute-0 sudo[92322]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-azzxbudxgdwzqiijixylejzttdvtgwnz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349988.3158417-837-227162160349731/AnsiballZ_stat.py'
Nov 28 17:13:08 compute-0 sudo[92322]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:13:08 compute-0 python3.9[92324]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:13:08 compute-0 sudo[92322]: pam_unix(sudo:session): session closed for user root
Nov 28 17:13:09 compute-0 sudo[92400]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ljouchjqinenjmdubqhdzjorpefbphgv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349988.3158417-837-227162160349731/AnsiballZ_file.py'
Nov 28 17:13:09 compute-0 sudo[92400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:13:09 compute-0 python3.9[92402]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:13:09 compute-0 sudo[92400]: pam_unix(sudo:session): session closed for user root
Nov 28 17:13:09 compute-0 sudo[92552]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-exkqbmlcfomaiyxccwwojxtygvhtfgvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349989.460274-861-181795210019929/AnsiballZ_stat.py'
Nov 28 17:13:09 compute-0 sudo[92552]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:13:09 compute-0 python3.9[92554]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:13:09 compute-0 sudo[92552]: pam_unix(sudo:session): session closed for user root
Nov 28 17:13:10 compute-0 sudo[92630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-giwpujbikmmtfywnoxdipbpcwbzchizw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349989.460274-861-181795210019929/AnsiballZ_file.py'
Nov 28 17:13:10 compute-0 sudo[92630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:13:10 compute-0 python3.9[92632]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:13:10 compute-0 sudo[92630]: pam_unix(sudo:session): session closed for user root
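Annotation: the `91-*.preset` files written above are ordinary systemd preset policy, consulted by `systemctl preset` to decide whether a unit should be enabled by default. The log does not show the file contents; a plausible (assumed) body for `91-netns-placeholder.preset` would be a single enable line:

```
# /etc/systemd/system-preset/91-netns-placeholder.preset (assumed content)
enable netns-placeholder.service
```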
Nov 28 17:13:11 compute-0 sudo[92782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-invhopiajtynpqcjpwhjebtrqmnmpxvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349990.6401331-885-111167338305924/AnsiballZ_systemd.py'
Nov 28 17:13:11 compute-0 sudo[92782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:13:11 compute-0 python3.9[92784]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 17:13:11 compute-0 systemd[1]: Reloading.
Nov 28 17:13:11 compute-0 systemd-sysv-generator[92812]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 17:13:11 compute-0 systemd-rc-local-generator[92806]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 17:13:11 compute-0 systemd[1]: Starting Create netns directory...
Nov 28 17:13:11 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 28 17:13:11 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 28 17:13:11 compute-0 systemd[1]: Finished Create netns directory.
Nov 28 17:13:11 compute-0 sudo[92782]: pam_unix(sudo:session): session closed for user root
Nov 28 17:13:12 compute-0 sudo[92975]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wbbdlcivvpoclngvoaxectclwisrczzt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349992.0262914-905-248546732323907/AnsiballZ_file.py'
Nov 28 17:13:12 compute-0 sudo[92975]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:13:12 compute-0 python3.9[92977]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:13:12 compute-0 sudo[92975]: pam_unix(sudo:session): session closed for user root
Nov 28 17:13:13 compute-0 sudo[93127]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xzhyftbewsorwpeipxwlugqgtievywfr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349992.7606747-921-214597281975788/AnsiballZ_stat.py'
Nov 28 17:13:13 compute-0 sudo[93127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:13:13 compute-0 python3.9[93129]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:13:13 compute-0 sudo[93127]: pam_unix(sudo:session): session closed for user root
Nov 28 17:13:13 compute-0 sudo[93250]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymtpmxpjqbcwkyndgjsecexnqodlrnta ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349992.7606747-921-214597281975788/AnsiballZ_copy.py'
Nov 28 17:13:13 compute-0 sudo[93250]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:13:13 compute-0 python3.9[93252]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764349992.7606747-921-214597281975788/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:13:13 compute-0 sudo[93250]: pam_unix(sudo:session): session closed for user root
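Annotation: the `checksum=4098dd…` reported by the copy task is the SHA-1 digest of the file contents — the same digest the surrounding stat tasks request with `checksum_algorithm=sha1`. Computing it outside Ansible is straightforward (helper name and path are illustrative):

```python
import hashlib

def sha1_of(path: str) -> str:
    """Hex SHA-1 of a file, matching Ansible's checksum_algorithm=sha1."""
    digest = hashlib.sha1()
    with open(path, "rb") as f:
        # Read in chunks so large files don't need to fit in memory.
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()
```

Ansible uses this checksum to decide whether the destination file already matches the source and the copy can be skipped.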
Nov 28 17:13:14 compute-0 sudo[93402]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-inojsxwgaxqdxxbsrmzzwctfbtcaxqnh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349994.3128226-955-117423795184324/AnsiballZ_file.py'
Nov 28 17:13:14 compute-0 sudo[93402]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:13:14 compute-0 python3.9[93404]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:13:14 compute-0 sudo[93402]: pam_unix(sudo:session): session closed for user root
Nov 28 17:13:15 compute-0 sudo[93554]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rnukvpvlwsjblqwkntueghgkfwrgvdrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349995.2388802-971-138708956029884/AnsiballZ_stat.py'
Nov 28 17:13:15 compute-0 sudo[93554]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:13:15 compute-0 python3.9[93556]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:13:15 compute-0 sudo[93554]: pam_unix(sudo:session): session closed for user root
Nov 28 17:13:16 compute-0 sudo[93677]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yxqfnhwbyjaxbaetjnozienytuffyxnw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349995.2388802-971-138708956029884/AnsiballZ_copy.py'
Nov 28 17:13:16 compute-0 sudo[93677]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:13:16 compute-0 python3.9[93679]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764349995.2388802-971-138708956029884/.source.json _original_basename=.oz3_2f32 follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:13:16 compute-0 sudo[93677]: pam_unix(sudo:session): session closed for user root
Nov 28 17:13:16 compute-0 sudo[93829]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vnbvdsyqyvuxwoocotylesqwwrxhziom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349996.572048-1001-54423556357481/AnsiballZ_file.py'
Nov 28 17:13:16 compute-0 sudo[93829]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:13:17 compute-0 python3.9[93831]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:13:17 compute-0 sudo[93829]: pam_unix(sudo:session): session closed for user root
Nov 28 17:13:17 compute-0 sudo[93981]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iyohikbxnzkpbvkjdrxpkejtdofqwwml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349997.2682126-1017-57003352639984/AnsiballZ_stat.py'
Nov 28 17:13:17 compute-0 sudo[93981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:13:17 compute-0 sudo[93981]: pam_unix(sudo:session): session closed for user root
Nov 28 17:13:18 compute-0 sudo[94104]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zndkeoagnpwncqxdutquptnfrkgkzopq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349997.2682126-1017-57003352639984/AnsiballZ_copy.py'
Nov 28 17:13:18 compute-0 sudo[94104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:13:18 compute-0 sudo[94104]: pam_unix(sudo:session): session closed for user root
Nov 28 17:13:19 compute-0 sudo[94256]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ddxnpfcxxgmcsuzvuzevokhlkstveima ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764349998.9941502-1051-129843558473959/AnsiballZ_container_config_data.py'
Nov 28 17:13:19 compute-0 sudo[94256]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:13:19 compute-0 python3.9[94258]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Nov 28 17:13:19 compute-0 sudo[94256]: pam_unix(sudo:session): session closed for user root
Nov 28 17:13:20 compute-0 sudo[94408]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qkavbsggnzxylgstojalnigcdfyfzyyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350000.116596-1069-253020444438765/AnsiballZ_container_config_hash.py'
Nov 28 17:13:20 compute-0 sudo[94408]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:13:20 compute-0 python3.9[94410]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 28 17:13:20 compute-0 sudo[94408]: pam_unix(sudo:session): session closed for user root
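Annotation: `container_config_hash` above appears to derive a hash over the generated config under `config_vol_prefix=/var/lib/config-data`, so that containers are restarted when their configuration changes. A simplified stand-in for that idea — the helper name and hashing scheme are assumptions, not the module's actual implementation:

```python
import hashlib
import os

def config_dir_hash(root: str) -> str:
    """Combined SHA-256 over all files under root, stable across walk order."""
    digest = hashlib.sha256()
    for dirpath, _dirnames, filenames in sorted(os.walk(root)):
        for name in sorted(filenames):
            path = os.path.join(dirpath, name)
            digest.update(path.encode())        # include the path in the digest
            with open(path, "rb") as f:
                digest.update(f.read())         # and the file contents
    return digest.hexdigest()
```

Any change to a file's content or location changes the resulting hash, which is enough to trigger a container redeploy in a scheme like this.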
Nov 28 17:13:21 compute-0 sudo[94560]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzojabxhmyntzypbenrpgwfncqobrvmv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350001.0791576-1087-51177334353549/AnsiballZ_podman_container_info.py'
Nov 28 17:13:21 compute-0 sudo[94560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:13:21 compute-0 python3.9[94562]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 28 17:13:21 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 17:13:21 compute-0 sudo[94560]: pam_unix(sudo:session): session closed for user root
Nov 28 17:13:22 compute-0 sudo[94723]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-londkcqcbyndqtqugszqdaxiapymrnku ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764350002.3811798-1113-99116525000643/AnsiballZ_edpm_container_manage.py'
Nov 28 17:13:22 compute-0 sudo[94723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:13:23 compute-0 python3[94725]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 28 17:13:23 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 17:13:23 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 17:13:23 compute-0 podman[94762]: 2025-11-28 17:13:23.303853785 +0000 UTC m=+0.044460989 container create 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 17:13:23 compute-0 podman[94762]: 2025-11-28 17:13:23.28087861 +0000 UTC m=+0.021485814 image pull 52cb1910f3f090372807028d1c2aea98d2557b1086636469529f290368ecdf69 quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Nov 28 17:13:23 compute-0 python3[94725]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Nov 28 17:13:23 compute-0 sudo[94723]: pam_unix(sudo:session): session closed for user root
Nov 28 17:13:24 compute-0 sudo[94949]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cycxyzkwwrwangrdqfankjisiwrwjihf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350003.7679644-1129-295099622422/AnsiballZ_stat.py'
Nov 28 17:13:24 compute-0 sudo[94949]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:13:24 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 17:13:24 compute-0 python3.9[94951]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 17:13:24 compute-0 sudo[94949]: pam_unix(sudo:session): session closed for user root
Nov 28 17:13:25 compute-0 sudo[95103]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lhojqnvkasbgsrobtmzhqqcczagodkma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350004.7511785-1147-113145457892654/AnsiballZ_file.py'
Nov 28 17:13:25 compute-0 sudo[95103]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:13:25 compute-0 python3.9[95105]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:13:25 compute-0 sudo[95103]: pam_unix(sudo:session): session closed for user root
Nov 28 17:13:25 compute-0 sudo[95179]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-azsjpkwxtirtzheivvqnyfekwvlmbjlz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350004.7511785-1147-113145457892654/AnsiballZ_stat.py'
Nov 28 17:13:25 compute-0 sudo[95179]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:13:25 compute-0 python3.9[95181]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 17:13:25 compute-0 sudo[95179]: pam_unix(sudo:session): session closed for user root
Nov 28 17:13:26 compute-0 sudo[95330]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iezaxhkkgjgkguruzkfwlvfrdlyybrdb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350005.8603642-1147-216349039318411/AnsiballZ_copy.py'
Nov 28 17:13:26 compute-0 sudo[95330]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:13:26 compute-0 python3.9[95332]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764350005.8603642-1147-216349039318411/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:13:26 compute-0 sudo[95330]: pam_unix(sudo:session): session closed for user root
Nov 28 17:13:26 compute-0 sudo[95406]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cextmubcwtlwphjxmsveesvlzmlfajry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350005.8603642-1147-216349039318411/AnsiballZ_systemd.py'
Nov 28 17:13:26 compute-0 sudo[95406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:13:27 compute-0 python3.9[95408]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 17:13:27 compute-0 systemd[1]: Reloading.
Nov 28 17:13:27 compute-0 systemd-rc-local-generator[95431]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 17:13:27 compute-0 systemd-sysv-generator[95435]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 17:13:27 compute-0 sudo[95406]: pam_unix(sudo:session): session closed for user root
Nov 28 17:13:27 compute-0 sudo[95517]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yuuttvuuefkwrccnbszaxfqrbojyuizy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350005.8603642-1147-216349039318411/AnsiballZ_systemd.py'
Nov 28 17:13:27 compute-0 sudo[95517]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:13:27 compute-0 python3.9[95519]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 17:13:27 compute-0 systemd[1]: Reloading.
Nov 28 17:13:27 compute-0 systemd-rc-local-generator[95548]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 17:13:27 compute-0 systemd-sysv-generator[95551]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 17:13:28 compute-0 systemd[1]: Starting ovn_controller container...
Nov 28 17:13:28 compute-0 systemd[1]: Created slice Virtual Machine and Container Slice.
Nov 28 17:13:28 compute-0 systemd[1]: Started libcrun container.
Nov 28 17:13:28 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1ea4df85b3acd66284a5bc02b37d772e2548064716e0f05a518c6ad794f1de51/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Nov 28 17:13:28 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc.
Nov 28 17:13:28 compute-0 podman[95559]: 2025-11-28 17:13:28.273463802 +0000 UTC m=+0.126801976 container init 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 28 17:13:28 compute-0 ovn_controller[95574]: + sudo -E kolla_set_configs
Nov 28 17:13:28 compute-0 podman[95559]: 2025-11-28 17:13:28.299846165 +0000 UTC m=+0.153184319 container start 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 28 17:13:28 compute-0 edpm-start-podman-container[95559]: ovn_controller
Nov 28 17:13:28 compute-0 systemd[1]: Created slice User Slice of UID 0.
Nov 28 17:13:28 compute-0 systemd[1]: Starting User Runtime Directory /run/user/0...
Nov 28 17:13:28 compute-0 edpm-start-podman-container[95558]: Creating additional drop-in dependency for "ovn_controller" (28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc)
Nov 28 17:13:28 compute-0 systemd[1]: Finished User Runtime Directory /run/user/0.
Nov 28 17:13:28 compute-0 systemd[1]: Starting User Manager for UID 0...
Nov 28 17:13:28 compute-0 podman[95580]: 2025-11-28 17:13:28.387515044 +0000 UTC m=+0.075942636 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Nov 28 17:13:28 compute-0 systemd[1]: Reloading.
Nov 28 17:13:28 compute-0 systemd-sysv-generator[95652]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 17:13:28 compute-0 systemd-rc-local-generator[95648]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 17:13:28 compute-0 systemd[1]: Started ovn_controller container.
Nov 28 17:13:28 compute-0 systemd[1]: 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc-3dc1936d5a2d97aa.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 17:13:28 compute-0 systemd[1]: 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc-3dc1936d5a2d97aa.service: Failed with result 'exit-code'.
Nov 28 17:13:28 compute-0 systemd[95616]: pam_unix(systemd-user:session): session opened for user root(uid=0) by root(uid=0)
Nov 28 17:13:28 compute-0 sudo[95517]: pam_unix(sudo:session): session closed for user root
Nov 28 17:13:28 compute-0 systemd[95616]: Queued start job for default target Main User Target.
Nov 28 17:13:28 compute-0 systemd[95616]: Created slice User Application Slice.
Nov 28 17:13:28 compute-0 systemd[95616]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Nov 28 17:13:28 compute-0 systemd[95616]: Started Daily Cleanup of User's Temporary Directories.
Nov 28 17:13:28 compute-0 systemd[95616]: Reached target Paths.
Nov 28 17:13:28 compute-0 systemd[95616]: Reached target Timers.
Nov 28 17:13:28 compute-0 systemd[95616]: Starting D-Bus User Message Bus Socket...
Nov 28 17:13:28 compute-0 systemd[95616]: Starting Create User's Volatile Files and Directories...
Nov 28 17:13:28 compute-0 systemd[95616]: Listening on D-Bus User Message Bus Socket.
Nov 28 17:13:28 compute-0 systemd[95616]: Reached target Sockets.
Nov 28 17:13:28 compute-0 systemd[95616]: Finished Create User's Volatile Files and Directories.
Nov 28 17:13:28 compute-0 systemd[95616]: Reached target Basic System.
Nov 28 17:13:28 compute-0 systemd[95616]: Reached target Main User Target.
Nov 28 17:13:28 compute-0 systemd[95616]: Startup finished in 114ms.
Nov 28 17:13:28 compute-0 systemd[1]: Started User Manager for UID 0.
Nov 28 17:13:28 compute-0 systemd[1]: Started Session c1 of User root.
Nov 28 17:13:28 compute-0 ovn_controller[95574]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 28 17:13:28 compute-0 ovn_controller[95574]: INFO:__main__:Validating config file
Nov 28 17:13:28 compute-0 ovn_controller[95574]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 28 17:13:28 compute-0 ovn_controller[95574]: INFO:__main__:Writing out command to execute
Nov 28 17:13:28 compute-0 systemd[1]: session-c1.scope: Deactivated successfully.
Nov 28 17:13:28 compute-0 ovn_controller[95574]: ++ cat /run_command
Nov 28 17:13:28 compute-0 ovn_controller[95574]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Nov 28 17:13:28 compute-0 ovn_controller[95574]: + ARGS=
Nov 28 17:13:28 compute-0 ovn_controller[95574]: + sudo kolla_copy_cacerts
Nov 28 17:13:28 compute-0 systemd[1]: Started Session c2 of User root.
Nov 28 17:13:28 compute-0 systemd[1]: session-c2.scope: Deactivated successfully.
Nov 28 17:13:28 compute-0 ovn_controller[95574]: + [[ ! -n '' ]]
Nov 28 17:13:28 compute-0 ovn_controller[95574]: + . kolla_extend_start
Nov 28 17:13:28 compute-0 ovn_controller[95574]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Nov 28 17:13:28 compute-0 ovn_controller[95574]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Nov 28 17:13:28 compute-0 ovn_controller[95574]: + umask 0022
Nov 28 17:13:28 compute-0 ovn_controller[95574]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Nov 28 17:13:28 compute-0 ovn_controller[95574]: 2025-11-28T17:13:28Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Nov 28 17:13:28 compute-0 ovn_controller[95574]: 2025-11-28T17:13:28Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Nov 28 17:13:28 compute-0 ovn_controller[95574]: 2025-11-28T17:13:28Z|00003|main|INFO|OVN internal version is : [24.03.7-20.33.0-76.8]
Nov 28 17:13:28 compute-0 ovn_controller[95574]: 2025-11-28T17:13:28Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Nov 28 17:13:28 compute-0 ovn_controller[95574]: 2025-11-28T17:13:28Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Nov 28 17:13:28 compute-0 ovn_controller[95574]: 2025-11-28T17:13:28Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Nov 28 17:13:28 compute-0 NetworkManager[55763]: <info>  [1764350008.9118] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Nov 28 17:13:28 compute-0 NetworkManager[55763]: <info>  [1764350008.9127] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 28 17:13:28 compute-0 NetworkManager[55763]: <info>  [1764350008.9139] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/15)
Nov 28 17:13:28 compute-0 NetworkManager[55763]: <info>  [1764350008.9144] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/16)
Nov 28 17:13:28 compute-0 NetworkManager[55763]: <info>  [1764350008.9147] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Nov 28 17:13:28 compute-0 kernel: br-int: entered promiscuous mode
Nov 28 17:13:28 compute-0 ovn_controller[95574]: 2025-11-28T17:13:28Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Nov 28 17:13:28 compute-0 ovn_controller[95574]: 2025-11-28T17:13:28Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 28 17:13:28 compute-0 ovn_controller[95574]: 2025-11-28T17:13:28Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 28 17:13:28 compute-0 ovn_controller[95574]: 2025-11-28T17:13:28Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Nov 28 17:13:28 compute-0 ovn_controller[95574]: 2025-11-28T17:13:28Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Nov 28 17:13:28 compute-0 ovn_controller[95574]: 2025-11-28T17:13:28Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Nov 28 17:13:28 compute-0 ovn_controller[95574]: 2025-11-28T17:13:28Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Nov 28 17:13:28 compute-0 ovn_controller[95574]: 2025-11-28T17:13:28Z|00014|main|INFO|OVS feature set changed, force recompute.
Nov 28 17:13:28 compute-0 ovn_controller[95574]: 2025-11-28T17:13:28Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 28 17:13:28 compute-0 ovn_controller[95574]: 2025-11-28T17:13:28Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 28 17:13:28 compute-0 ovn_controller[95574]: 2025-11-28T17:13:28Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 28 17:13:28 compute-0 ovn_controller[95574]: 2025-11-28T17:13:28Z|00018|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Nov 28 17:13:28 compute-0 ovn_controller[95574]: 2025-11-28T17:13:28Z|00019|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Nov 28 17:13:28 compute-0 ovn_controller[95574]: 2025-11-28T17:13:28Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 28 17:13:28 compute-0 ovn_controller[95574]: 2025-11-28T17:13:28Z|00021|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Nov 28 17:13:28 compute-0 ovn_controller[95574]: 2025-11-28T17:13:28Z|00022|main|INFO|OVS feature set changed, force recompute.
Nov 28 17:13:28 compute-0 ovn_controller[95574]: 2025-11-28T17:13:28Z|00023|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Nov 28 17:13:28 compute-0 ovn_controller[95574]: 2025-11-28T17:13:28Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Nov 28 17:13:28 compute-0 ovn_controller[95574]: 2025-11-28T17:13:28Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 28 17:13:28 compute-0 ovn_controller[95574]: 2025-11-28T17:13:28Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 28 17:13:28 compute-0 ovn_controller[95574]: 2025-11-28T17:13:28Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 28 17:13:28 compute-0 ovn_controller[95574]: 2025-11-28T17:13:28Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 28 17:13:28 compute-0 ovn_controller[95574]: 2025-11-28T17:13:28Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 28 17:13:28 compute-0 ovn_controller[95574]: 2025-11-28T17:13:28Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 28 17:13:28 compute-0 NetworkManager[55763]: <info>  [1764350008.9290] manager: (ovn-ff0c0f-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Nov 28 17:13:28 compute-0 NetworkManager[55763]: <info>  [1764350008.9295] manager: (ovn-01f1e5-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/18)
Nov 28 17:13:28 compute-0 kernel: genev_sys_6081: entered promiscuous mode
Nov 28 17:13:28 compute-0 NetworkManager[55763]: <info>  [1764350008.9481] device (genev_sys_6081): carrier: link connected
Nov 28 17:13:28 compute-0 NetworkManager[55763]: <info>  [1764350008.9483] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/19)
Nov 28 17:13:28 compute-0 systemd-udevd[95781]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 17:13:28 compute-0 systemd-udevd[95782]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 17:13:29 compute-0 sudo[95837]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lfnsdgoxfarctmvnrofgipuyonjdesdh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350008.8463366-1203-35928197935862/AnsiballZ_command.py'
Nov 28 17:13:29 compute-0 sudo[95837]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:13:29 compute-0 python3.9[95839]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 17:13:29 compute-0 ovs-vsctl[95840]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Nov 28 17:13:29 compute-0 sudo[95837]: pam_unix(sudo:session): session closed for user root
Nov 28 17:13:29 compute-0 sudo[95990]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sjmvpkztpyyrruwxwexpkruvhvbtzuru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350009.636216-1219-239374324868174/AnsiballZ_command.py'
Nov 28 17:13:29 compute-0 sudo[95990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:13:30 compute-0 python3.9[95992]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g' _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 17:13:30 compute-0 ovs-vsctl[95994]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Nov 28 17:13:30 compute-0 sudo[95990]: pam_unix(sudo:session): session closed for user root
Nov 28 17:13:30 compute-0 sudo[96145]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jelioncqsnztecujnxokunqlsqribjrw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350010.7341707-1247-258194042107340/AnsiballZ_command.py'
Nov 28 17:13:30 compute-0 sudo[96145]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:13:31 compute-0 python3.9[96147]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 17:13:31 compute-0 ovs-vsctl[96148]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Nov 28 17:13:31 compute-0 sudo[96145]: pam_unix(sudo:session): session closed for user root
Nov 28 17:13:31 compute-0 sshd-session[85082]: Connection closed by 192.168.122.30 port 59564
Nov 28 17:13:31 compute-0 sshd-session[85079]: pam_unix(sshd:session): session closed for user zuul
Nov 28 17:13:31 compute-0 systemd[1]: session-20.scope: Deactivated successfully.
Nov 28 17:13:31 compute-0 systemd[1]: session-20.scope: Consumed 44.571s CPU time.
Nov 28 17:13:31 compute-0 systemd-logind[788]: Session 20 logged out. Waiting for processes to exit.
Nov 28 17:13:31 compute-0 systemd-logind[788]: Removed session 20.
Nov 28 17:13:37 compute-0 sshd-session[96173]: Accepted publickey for zuul from 192.168.122.30 port 58188 ssh2: ECDSA SHA256:WZF57Px3euv6N78lWIGFzFXKuQ6jzdjGOx/DgOfNYD8
Nov 28 17:13:37 compute-0 systemd-logind[788]: New session 22 of user zuul.
Nov 28 17:13:37 compute-0 systemd[1]: Started Session 22 of User zuul.
Nov 28 17:13:37 compute-0 sshd-session[96173]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 28 17:13:38 compute-0 python3.9[96326]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 17:13:39 compute-0 systemd[1]: Stopping User Manager for UID 0...
Nov 28 17:13:39 compute-0 systemd[95616]: Activating special unit Exit the Session...
Nov 28 17:13:39 compute-0 systemd[95616]: Stopped target Main User Target.
Nov 28 17:13:39 compute-0 systemd[95616]: Stopped target Basic System.
Nov 28 17:13:39 compute-0 systemd[95616]: Stopped target Paths.
Nov 28 17:13:39 compute-0 systemd[95616]: Stopped target Sockets.
Nov 28 17:13:39 compute-0 systemd[95616]: Stopped target Timers.
Nov 28 17:13:39 compute-0 systemd[95616]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 28 17:13:39 compute-0 systemd[95616]: Closed D-Bus User Message Bus Socket.
Nov 28 17:13:39 compute-0 systemd[95616]: Stopped Create User's Volatile Files and Directories.
Nov 28 17:13:39 compute-0 systemd[95616]: Removed slice User Application Slice.
Nov 28 17:13:39 compute-0 systemd[95616]: Reached target Shutdown.
Nov 28 17:13:39 compute-0 systemd[95616]: Finished Exit the Session.
Nov 28 17:13:39 compute-0 systemd[95616]: Reached target Exit the Session.
Nov 28 17:13:39 compute-0 systemd[1]: user@0.service: Deactivated successfully.
Nov 28 17:13:39 compute-0 systemd[1]: Stopped User Manager for UID 0.
Nov 28 17:13:39 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/0...
Nov 28 17:13:39 compute-0 systemd[1]: run-user-0.mount: Deactivated successfully.
Nov 28 17:13:39 compute-0 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Nov 28 17:13:39 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/0.
Nov 28 17:13:39 compute-0 systemd[1]: Removed slice User Slice of UID 0.
Nov 28 17:13:40 compute-0 sudo[96483]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-opjhevgerbxkyvvhdvapxjehchhnwise ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350019.3706527-53-149095128469705/AnsiballZ_file.py'
Nov 28 17:13:40 compute-0 sudo[96483]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:13:40 compute-0 python3.9[96485]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:13:40 compute-0 sudo[96483]: pam_unix(sudo:session): session closed for user root
Nov 28 17:13:40 compute-0 sudo[96635]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ulernqujihtqxksbnwizsrodgedvwyhy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350020.612088-53-22731365933925/AnsiballZ_file.py'
Nov 28 17:13:40 compute-0 sudo[96635]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:13:41 compute-0 python3.9[96637]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:13:41 compute-0 sudo[96635]: pam_unix(sudo:session): session closed for user root
Nov 28 17:13:41 compute-0 sudo[96787]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rtcacdklmkhvjdqpwlidcldsgdsjrinh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350021.2571664-53-10788420683555/AnsiballZ_file.py'
Nov 28 17:13:41 compute-0 sudo[96787]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:13:41 compute-0 python3.9[96789]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:13:41 compute-0 sudo[96787]: pam_unix(sudo:session): session closed for user root
Nov 28 17:13:42 compute-0 sudo[96939]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vstjncfscyausdjmroehoylppugslkbr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350022.075669-53-269303202720242/AnsiballZ_file.py'
Nov 28 17:13:42 compute-0 sudo[96939]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:13:42 compute-0 python3.9[96941]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:13:42 compute-0 sudo[96939]: pam_unix(sudo:session): session closed for user root
Nov 28 17:13:42 compute-0 sudo[97091]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxxxjnqmabnbyuvnrgserjvpuueuenpy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350022.6672144-53-135421111206166/AnsiballZ_file.py'
Nov 28 17:13:42 compute-0 sudo[97091]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:13:43 compute-0 python3.9[97093]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:13:43 compute-0 sudo[97091]: pam_unix(sudo:session): session closed for user root
Nov 28 17:13:43 compute-0 python3.9[97243]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 17:13:44 compute-0 sudo[97393]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvkbliozzooorpvakflincpmjeyfiqnf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350024.1676981-141-139289317715416/AnsiballZ_seboolean.py'
Nov 28 17:13:44 compute-0 sudo[97393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:13:44 compute-0 python3.9[97395]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Nov 28 17:13:45 compute-0 sudo[97393]: pam_unix(sudo:session): session closed for user root
Nov 28 17:13:46 compute-0 python3.9[97545]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:13:47 compute-0 python3.9[97666]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764350025.7719014-157-50295572016309/.source follow=False _original_basename=haproxy.j2 checksum=95c62e64c8f82dd9393a560d1b052dc98d38f810 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:13:48 compute-0 python3.9[97817]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:13:48 compute-0 python3.9[97938]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764350027.656907-187-274093781966320/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:13:49 compute-0 sudo[98088]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbkkrafjnlykeuszhycrgsupcitztglg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350029.0146043-221-67826773227231/AnsiballZ_setup.py'
Nov 28 17:13:49 compute-0 sudo[98088]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:13:49 compute-0 python3.9[98090]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 17:13:49 compute-0 sudo[98088]: pam_unix(sudo:session): session closed for user root
Nov 28 17:13:50 compute-0 sudo[98172]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yissxfobydjajkksqmncbolhptmjpoyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350029.0146043-221-67826773227231/AnsiballZ_dnf.py'
Nov 28 17:13:50 compute-0 sudo[98172]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:13:50 compute-0 python3.9[98174]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 17:13:51 compute-0 sudo[98172]: pam_unix(sudo:session): session closed for user root
Nov 28 17:13:52 compute-0 sudo[98325]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxomvfspugvgdpaddzxlppxwcnzkjqvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350032.2456381-245-131287745882585/AnsiballZ_systemd.py'
Nov 28 17:13:52 compute-0 sudo[98325]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:13:53 compute-0 python3.9[98327]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 28 17:13:53 compute-0 sudo[98325]: pam_unix(sudo:session): session closed for user root
Nov 28 17:13:53 compute-0 python3.9[98480]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:13:54 compute-0 python3.9[98601]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764350033.4364953-261-197371063597782/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:13:55 compute-0 python3.9[98751]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:13:55 compute-0 python3.9[98872]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764350034.5982728-261-166882824818449/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:13:56 compute-0 python3.9[99022]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:13:57 compute-0 python3.9[99143]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764350036.328122-349-209767663173090/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:13:57 compute-0 python3.9[99293]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:13:58 compute-0 python3.9[99414]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764350037.4458795-349-171443759854216/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:13:58 compute-0 ovn_controller[95574]: 2025-11-28T17:13:58Z|00025|memory|INFO|16000 kB peak resident set size after 30.1 seconds
Nov 28 17:13:58 compute-0 ovn_controller[95574]: 2025-11-28T17:13:58Z|00026|memory|INFO|idl-cells-OVN_Southbound:256 idl-cells-Open_vSwitch:528 ofctrl_desired_flow_usage-KB:6 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:2
Nov 28 17:13:58 compute-0 podman[99538]: 2025-11-28 17:13:58.990828775 +0000 UTC m=+0.130880053 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Nov 28 17:13:59 compute-0 python3.9[99575]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 17:13:59 compute-0 sudo[99744]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxbmetabbdccyltluxizthnozekkmlvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350039.3121881-425-34314972805643/AnsiballZ_file.py'
Nov 28 17:13:59 compute-0 sudo[99744]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:13:59 compute-0 python3.9[99746]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:13:59 compute-0 sudo[99744]: pam_unix(sudo:session): session closed for user root
Nov 28 17:14:00 compute-0 sudo[99896]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzrhoetebffhvvtsygdkensfcviarunx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350040.0013125-441-198384268069625/AnsiballZ_stat.py'
Nov 28 17:14:00 compute-0 sudo[99896]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:14:00 compute-0 python3.9[99898]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:14:00 compute-0 sudo[99896]: pam_unix(sudo:session): session closed for user root
Nov 28 17:14:00 compute-0 sudo[99974]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-csomxangqqupfnoisdwtwiimcqjhqyle ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350040.0013125-441-198384268069625/AnsiballZ_file.py'
Nov 28 17:14:00 compute-0 sudo[99974]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:14:01 compute-0 python3.9[99976]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:14:01 compute-0 sudo[99974]: pam_unix(sudo:session): session closed for user root
Nov 28 17:14:01 compute-0 sudo[100126]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-prjqwbtuvtrwiovlpcbdpmzrfbblfklh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350041.1802325-441-120085545242397/AnsiballZ_stat.py'
Nov 28 17:14:01 compute-0 sudo[100126]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:14:01 compute-0 python3.9[100128]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:14:01 compute-0 sudo[100126]: pam_unix(sudo:session): session closed for user root
Nov 28 17:14:01 compute-0 sudo[100204]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nvqjfpuoedtrwseevudggsvazdnoxvmw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350041.1802325-441-120085545242397/AnsiballZ_file.py'
Nov 28 17:14:01 compute-0 sudo[100204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:14:02 compute-0 python3.9[100206]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:14:02 compute-0 sudo[100204]: pam_unix(sudo:session): session closed for user root
Nov 28 17:14:02 compute-0 sudo[100356]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixiirjtyndsqdxkmuvlwsdwblgqwdzpk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350042.362071-487-184547030605601/AnsiballZ_file.py'
Nov 28 17:14:02 compute-0 sudo[100356]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:14:02 compute-0 python3.9[100358]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:14:02 compute-0 sudo[100356]: pam_unix(sudo:session): session closed for user root
Nov 28 17:14:03 compute-0 sudo[100508]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pigcyntelnhylyygqveumqpaffpifbet ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350043.0571477-503-220266082526076/AnsiballZ_stat.py'
Nov 28 17:14:03 compute-0 sudo[100508]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:14:03 compute-0 python3.9[100510]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:14:03 compute-0 sudo[100508]: pam_unix(sudo:session): session closed for user root
Nov 28 17:14:03 compute-0 sudo[100586]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-inyrjkrtcofhyzalcfzxzwygjdtfnsqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350043.0571477-503-220266082526076/AnsiballZ_file.py'
Nov 28 17:14:03 compute-0 sudo[100586]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:14:04 compute-0 python3.9[100588]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:14:04 compute-0 sudo[100586]: pam_unix(sudo:session): session closed for user root
Nov 28 17:14:04 compute-0 sudo[100738]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hzgaogpehyiowxemckdgfztsookijcak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350044.2124994-527-193272638890027/AnsiballZ_stat.py'
Nov 28 17:14:04 compute-0 sudo[100738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:14:04 compute-0 python3.9[100740]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:14:04 compute-0 sudo[100738]: pam_unix(sudo:session): session closed for user root
Nov 28 17:14:04 compute-0 sudo[100816]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-widzlvdhgkutevcefqfveshyqvwcoaqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350044.2124994-527-193272638890027/AnsiballZ_file.py'
Nov 28 17:14:04 compute-0 sudo[100816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:14:05 compute-0 python3.9[100818]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:14:05 compute-0 sudo[100816]: pam_unix(sudo:session): session closed for user root
Nov 28 17:14:05 compute-0 sudo[100968]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zghuhocjgvzctouwadksqrdhcjqdzayx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350045.410365-551-180602245894910/AnsiballZ_systemd.py'
Nov 28 17:14:05 compute-0 sudo[100968]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:14:05 compute-0 python3.9[100970]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 17:14:06 compute-0 systemd[1]: Reloading.
Nov 28 17:14:06 compute-0 systemd-rc-local-generator[100995]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 17:14:06 compute-0 systemd-sysv-generator[100999]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 17:14:06 compute-0 sudo[100968]: pam_unix(sudo:session): session closed for user root
Nov 28 17:14:06 compute-0 sudo[101157]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wotkklxsvntzhpcttovdavemclwalbhv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350046.4609826-567-116522090745625/AnsiballZ_stat.py'
Nov 28 17:14:06 compute-0 sudo[101157]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:14:06 compute-0 python3.9[101159]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:14:06 compute-0 sudo[101157]: pam_unix(sudo:session): session closed for user root
Nov 28 17:14:07 compute-0 sudo[101235]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcocpaodjwsloxiajjcminnryuzqlytj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350046.4609826-567-116522090745625/AnsiballZ_file.py'
Nov 28 17:14:07 compute-0 sudo[101235]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:14:07 compute-0 python3.9[101237]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:14:07 compute-0 sudo[101235]: pam_unix(sudo:session): session closed for user root
Nov 28 17:14:07 compute-0 sudo[101387]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-siazvkwkczheqvgbbiuptqgvjsslalhx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350047.6437285-591-210006634621354/AnsiballZ_stat.py'
Nov 28 17:14:07 compute-0 sudo[101387]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:14:08 compute-0 python3.9[101389]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:14:08 compute-0 sudo[101387]: pam_unix(sudo:session): session closed for user root
Nov 28 17:14:08 compute-0 sudo[101465]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kcatdvechxgtmcjhnwnxvcbicyddldot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350047.6437285-591-210006634621354/AnsiballZ_file.py'
Nov 28 17:14:08 compute-0 sudo[101465]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:14:08 compute-0 python3.9[101467]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:14:08 compute-0 sudo[101465]: pam_unix(sudo:session): session closed for user root
Nov 28 17:14:09 compute-0 sudo[101617]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-geslifpoqzvolgfdnzgfofuthaehslmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350048.749328-615-185129264251525/AnsiballZ_systemd.py'
Nov 28 17:14:09 compute-0 sudo[101617]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:14:09 compute-0 python3.9[101619]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 17:14:09 compute-0 systemd[1]: Reloading.
Nov 28 17:14:09 compute-0 systemd-sysv-generator[101650]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 17:14:09 compute-0 systemd-rc-local-generator[101647]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 17:14:09 compute-0 systemd[1]: Starting Create netns directory...
Nov 28 17:14:09 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 28 17:14:09 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 28 17:14:09 compute-0 systemd[1]: Finished Create netns directory.
Nov 28 17:14:09 compute-0 sudo[101617]: pam_unix(sudo:session): session closed for user root
Nov 28 17:14:10 compute-0 sudo[101811]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bynjrjhrfageocvsocnpppfhchkhrsuq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350050.0395594-635-157426888101389/AnsiballZ_file.py'
Nov 28 17:14:10 compute-0 sudo[101811]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:14:10 compute-0 python3.9[101813]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:14:10 compute-0 sudo[101811]: pam_unix(sudo:session): session closed for user root
Nov 28 17:14:11 compute-0 sudo[101963]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zsjejvjjwczycjytklzjdvytqzfslmjf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350050.7493823-651-241072136527230/AnsiballZ_stat.py'
Nov 28 17:14:11 compute-0 sudo[101963]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:14:11 compute-0 python3.9[101965]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:14:11 compute-0 sudo[101963]: pam_unix(sudo:session): session closed for user root
Nov 28 17:14:11 compute-0 sudo[102086]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-azkgdqguuhkozbmajlnkbispllzezzuw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350050.7493823-651-241072136527230/AnsiballZ_copy.py'
Nov 28 17:14:11 compute-0 sudo[102086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:14:11 compute-0 python3.9[102088]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764350050.7493823-651-241072136527230/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:14:11 compute-0 sudo[102086]: pam_unix(sudo:session): session closed for user root
Nov 28 17:14:12 compute-0 sudo[102238]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbftqkmxkubbzejndjahczvgnekwkvmf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350052.2160504-685-153585824337566/AnsiballZ_file.py'
Nov 28 17:14:12 compute-0 sudo[102238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:14:12 compute-0 python3.9[102240]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:14:12 compute-0 sudo[102238]: pam_unix(sudo:session): session closed for user root
Nov 28 17:14:13 compute-0 sudo[102390]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xsvqmhefrrgyijaehpwlunvkrqpuukzt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350052.964224-701-116908512400863/AnsiballZ_stat.py'
Nov 28 17:14:13 compute-0 sudo[102390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:14:13 compute-0 python3.9[102392]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:14:13 compute-0 sudo[102390]: pam_unix(sudo:session): session closed for user root
Nov 28 17:14:13 compute-0 sudo[102513]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jnbqciczlsrsqkdlrceecmnzbtvdhzwi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350052.964224-701-116908512400863/AnsiballZ_copy.py'
Nov 28 17:14:13 compute-0 sudo[102513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:14:13 compute-0 python3.9[102515]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764350052.964224-701-116908512400863/.source.json _original_basename=.dgju21mu follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:14:13 compute-0 sudo[102513]: pam_unix(sudo:session): session closed for user root
Nov 28 17:14:14 compute-0 sudo[102665]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vetvrrgdsthmilitrumephmnskavasoy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350054.327378-731-146130361513397/AnsiballZ_file.py'
Nov 28 17:14:14 compute-0 sudo[102665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:14:14 compute-0 python3.9[102667]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:14:14 compute-0 sudo[102665]: pam_unix(sudo:session): session closed for user root
Nov 28 17:14:15 compute-0 sudo[102817]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pwffalfhhhntyqpseczoaznwukcmepyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350055.1329818-747-158537737045234/AnsiballZ_stat.py'
Nov 28 17:14:15 compute-0 sudo[102817]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:14:15 compute-0 sudo[102817]: pam_unix(sudo:session): session closed for user root
Nov 28 17:14:15 compute-0 sudo[102940]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjmjdgpoxfgyrntajxkiwhfshncdvmss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350055.1329818-747-158537737045234/AnsiballZ_copy.py'
Nov 28 17:14:15 compute-0 sudo[102940]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:14:16 compute-0 sudo[102940]: pam_unix(sudo:session): session closed for user root
Nov 28 17:14:17 compute-0 sudo[103092]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dcnrijlwnlpysoalyvkulixribkyaegn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350056.5940852-781-115845101808578/AnsiballZ_container_config_data.py'
Nov 28 17:14:17 compute-0 sudo[103092]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:14:17 compute-0 python3.9[103094]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Nov 28 17:14:17 compute-0 sudo[103092]: pam_unix(sudo:session): session closed for user root
Nov 28 17:14:17 compute-0 sudo[103244]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-omnjxwrukfbxpunvhtrucllsekzunrie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350057.5326002-799-122971320262357/AnsiballZ_container_config_hash.py'
Nov 28 17:14:17 compute-0 sudo[103244]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:14:18 compute-0 python3.9[103246]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 28 17:14:18 compute-0 sudo[103244]: pam_unix(sudo:session): session closed for user root
Nov 28 17:14:18 compute-0 sudo[103396]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqgmtleudzatlvybjhtfzysvdxvxticq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350058.447673-817-80087376510275/AnsiballZ_podman_container_info.py'
Nov 28 17:14:18 compute-0 sudo[103396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:14:19 compute-0 python3.9[103398]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 28 17:14:19 compute-0 sudo[103396]: pam_unix(sudo:session): session closed for user root
Nov 28 17:14:20 compute-0 sudo[103573]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-apsuiwxkvciaruhrkvwpphuibhczttwk ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764350059.8769443-843-53844794670862/AnsiballZ_edpm_container_manage.py'
Nov 28 17:14:20 compute-0 sudo[103573]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:14:20 compute-0 python3[103575]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 28 17:14:20 compute-0 podman[103614]: 2025-11-28 17:14:20.903864417 +0000 UTC m=+0.056259270 container create d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 28 17:14:20 compute-0 podman[103614]: 2025-11-28 17:14:20.876011121 +0000 UTC m=+0.028405884 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 17:14:20 compute-0 python3[103575]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 17:14:21 compute-0 sudo[103573]: pam_unix(sudo:session): session closed for user root
Nov 28 17:14:21 compute-0 sudo[103802]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zukuvvoxzxdowclhrvlchfrklgrgxjqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350061.290267-859-40835109896637/AnsiballZ_stat.py'
Nov 28 17:14:21 compute-0 sudo[103802]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:14:21 compute-0 python3.9[103804]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 17:14:21 compute-0 sudo[103802]: pam_unix(sudo:session): session closed for user root
Nov 28 17:14:22 compute-0 sudo[103956]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-koumfrjlkvaipvziwtwwusrfqjkgftmf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350062.1707175-877-140193271724778/AnsiballZ_file.py'
Nov 28 17:14:22 compute-0 sudo[103956]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:14:22 compute-0 python3.9[103958]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:14:22 compute-0 sudo[103956]: pam_unix(sudo:session): session closed for user root
Nov 28 17:14:22 compute-0 sudo[104032]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vsdmkbqcqdojuckfrclmkvapasapuknc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350062.1707175-877-140193271724778/AnsiballZ_stat.py'
Nov 28 17:14:22 compute-0 sudo[104032]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:14:23 compute-0 python3.9[104034]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 17:14:23 compute-0 sudo[104032]: pam_unix(sudo:session): session closed for user root
Nov 28 17:14:23 compute-0 sudo[104183]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-egzxoeizgwbvsprqfdbkrzgoenvdelsw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350063.1067445-877-202789215763421/AnsiballZ_copy.py'
Nov 28 17:14:23 compute-0 sudo[104183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:14:23 compute-0 python3.9[104185]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764350063.1067445-877-202789215763421/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:14:23 compute-0 sudo[104183]: pam_unix(sudo:session): session closed for user root
Nov 28 17:14:24 compute-0 sudo[104259]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mishwygzsvojzdzrbzsepbgrdsptkmhn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350063.1067445-877-202789215763421/AnsiballZ_systemd.py'
Nov 28 17:14:24 compute-0 sudo[104259]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:14:24 compute-0 python3.9[104261]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 17:14:24 compute-0 systemd[1]: Reloading.
Nov 28 17:14:24 compute-0 systemd-sysv-generator[104291]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 17:14:24 compute-0 systemd-rc-local-generator[104288]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 17:14:24 compute-0 sudo[104259]: pam_unix(sudo:session): session closed for user root
Nov 28 17:14:24 compute-0 sudo[104369]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilkwjwjalncmrxttrkztosotdibjyalp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350063.1067445-877-202789215763421/AnsiballZ_systemd.py'
Nov 28 17:14:24 compute-0 sudo[104369]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:14:25 compute-0 python3.9[104371]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 17:14:25 compute-0 systemd[1]: Reloading.
Nov 28 17:14:25 compute-0 systemd-sysv-generator[104401]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 17:14:25 compute-0 systemd-rc-local-generator[104396]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 17:14:25 compute-0 systemd[1]: Starting ovn_metadata_agent container...
Nov 28 17:14:25 compute-0 systemd[1]: Started libcrun container.
Nov 28 17:14:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ccaa019ca7d2285b1c4b8f8eb8a0eec00feaa9b8d0fbdea9d1238430e9a66652/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Nov 28 17:14:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ccaa019ca7d2285b1c4b8f8eb8a0eec00feaa9b8d0fbdea9d1238430e9a66652/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 17:14:25 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1.
Nov 28 17:14:25 compute-0 podman[104412]: 2025-11-28 17:14:25.527122127 +0000 UTC m=+0.158412499 container init d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 28 17:14:25 compute-0 ovn_metadata_agent[104428]: + sudo -E kolla_set_configs
Nov 28 17:14:25 compute-0 podman[104412]: 2025-11-28 17:14:25.558355461 +0000 UTC m=+0.189645783 container start d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 17:14:25 compute-0 edpm-start-podman-container[104412]: ovn_metadata_agent
Nov 28 17:14:25 compute-0 edpm-start-podman-container[104411]: Creating additional drop-in dependency for "ovn_metadata_agent" (d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1)
Nov 28 17:14:25 compute-0 ovn_metadata_agent[104428]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 28 17:14:25 compute-0 ovn_metadata_agent[104428]: INFO:__main__:Validating config file
Nov 28 17:14:25 compute-0 ovn_metadata_agent[104428]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 28 17:14:25 compute-0 ovn_metadata_agent[104428]: INFO:__main__:Copying service configuration files
Nov 28 17:14:25 compute-0 ovn_metadata_agent[104428]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Nov 28 17:14:25 compute-0 ovn_metadata_agent[104428]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Nov 28 17:14:25 compute-0 ovn_metadata_agent[104428]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Nov 28 17:14:25 compute-0 ovn_metadata_agent[104428]: INFO:__main__:Writing out command to execute
Nov 28 17:14:25 compute-0 ovn_metadata_agent[104428]: INFO:__main__:Setting permission for /var/lib/neutron
Nov 28 17:14:25 compute-0 ovn_metadata_agent[104428]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Nov 28 17:14:25 compute-0 ovn_metadata_agent[104428]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Nov 28 17:14:25 compute-0 ovn_metadata_agent[104428]: INFO:__main__:Setting permission for /var/lib/neutron/external
Nov 28 17:14:25 compute-0 ovn_metadata_agent[104428]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Nov 28 17:14:25 compute-0 ovn_metadata_agent[104428]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Nov 28 17:14:25 compute-0 ovn_metadata_agent[104428]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Nov 28 17:14:25 compute-0 ovn_metadata_agent[104428]: ++ cat /run_command
Nov 28 17:14:25 compute-0 systemd[1]: Reloading.
Nov 28 17:14:25 compute-0 ovn_metadata_agent[104428]: + CMD=neutron-ovn-metadata-agent
Nov 28 17:14:25 compute-0 ovn_metadata_agent[104428]: + ARGS=
Nov 28 17:14:25 compute-0 ovn_metadata_agent[104428]: + sudo kolla_copy_cacerts
Nov 28 17:14:25 compute-0 podman[104434]: 2025-11-28 17:14:25.662496257 +0000 UTC m=+0.089748590 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 28 17:14:25 compute-0 ovn_metadata_agent[104428]: + [[ ! -n '' ]]
Nov 28 17:14:25 compute-0 ovn_metadata_agent[104428]: + . kolla_extend_start
Nov 28 17:14:25 compute-0 ovn_metadata_agent[104428]: Running command: 'neutron-ovn-metadata-agent'
Nov 28 17:14:25 compute-0 ovn_metadata_agent[104428]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Nov 28 17:14:25 compute-0 ovn_metadata_agent[104428]: + umask 0022
Nov 28 17:14:25 compute-0 ovn_metadata_agent[104428]: + exec neutron-ovn-metadata-agent
Nov 28 17:14:25 compute-0 systemd-sysv-generator[104509]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 17:14:25 compute-0 systemd-rc-local-generator[104506]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 17:14:25 compute-0 systemd[1]: Started ovn_metadata_agent container.
Nov 28 17:14:25 compute-0 sudo[104369]: pam_unix(sudo:session): session closed for user root
Nov 28 17:14:26 compute-0 sshd-session[96176]: Connection closed by 192.168.122.30 port 58188
Nov 28 17:14:26 compute-0 sshd-session[96173]: pam_unix(sshd:session): session closed for user zuul
Nov 28 17:14:26 compute-0 systemd[1]: session-22.scope: Deactivated successfully.
Nov 28 17:14:26 compute-0 systemd[1]: session-22.scope: Consumed 34.026s CPU time.
Nov 28 17:14:26 compute-0 systemd-logind[788]: Session 22 logged out. Waiting for processes to exit.
Nov 28 17:14:26 compute-0 systemd-logind[788]: Removed session 22.
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.599 104433 INFO neutron.common.config [-] Logging enabled!
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.600 104433 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.600 104433 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.600 104433 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.601 104433 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.601 104433 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.601 104433 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.601 104433 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.601 104433 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.601 104433 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.601 104433 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.602 104433 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.602 104433 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.602 104433 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.602 104433 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.602 104433 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.602 104433 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.602 104433 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.603 104433 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.603 104433 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.603 104433 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.603 104433 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.603 104433 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.603 104433 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.603 104433 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.603 104433 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.604 104433 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.604 104433 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.604 104433 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.604 104433 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.604 104433 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.604 104433 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.604 104433 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.605 104433 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.605 104433 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.605 104433 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.605 104433 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.605 104433 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.605 104433 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.605 104433 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.606 104433 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.606 104433 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.606 104433 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.606 104433 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.606 104433 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.606 104433 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.606 104433 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.607 104433 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.607 104433 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.607 104433 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.607 104433 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.607 104433 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.607 104433 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.607 104433 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.607 104433 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.608 104433 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.608 104433 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.608 104433 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.608 104433 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.608 104433 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.608 104433 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.608 104433 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.608 104433 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.609 104433 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.609 104433 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.609 104433 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.609 104433 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.609 104433 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.609 104433 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.609 104433 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.610 104433 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.610 104433 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.610 104433 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.610 104433 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.610 104433 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.610 104433 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.610 104433 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.611 104433 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.611 104433 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.611 104433 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.611 104433 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.611 104433 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.611 104433 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.611 104433 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.612 104433 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.612 104433 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.612 104433 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.612 104433 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.612 104433 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.612 104433 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.612 104433 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.612 104433 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.613 104433 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.613 104433 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.613 104433 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.613 104433 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.613 104433 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.613 104433 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.613 104433 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.613 104433 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.614 104433 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.614 104433 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.614 104433 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.614 104433 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.614 104433 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.614 104433 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.614 104433 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.614 104433 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.615 104433 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.615 104433 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.615 104433 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.615 104433 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.615 104433 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.615 104433 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.615 104433 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.616 104433 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.616 104433 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.616 104433 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.616 104433 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.616 104433 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.616 104433 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.616 104433 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.617 104433 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.617 104433 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.617 104433 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.617 104433 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.617 104433 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.617 104433 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.617 104433 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.618 104433 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.618 104433 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.618 104433 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.618 104433 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.618 104433 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.618 104433 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.618 104433 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.619 104433 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.619 104433 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.619 104433 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.619 104433 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.619 104433 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.619 104433 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.619 104433 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.619 104433 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.620 104433 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.620 104433 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.620 104433 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.620 104433 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.620 104433 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.620 104433 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.620 104433 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.621 104433 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.621 104433 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.621 104433 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.621 104433 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.621 104433 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.621 104433 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.621 104433 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.621 104433 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.622 104433 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.622 104433 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.622 104433 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.622 104433 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.622 104433 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.622 104433 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.622 104433 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.623 104433 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.623 104433 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.623 104433 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.623 104433 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.623 104433 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.623 104433 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.623 104433 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.624 104433 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.624 104433 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.624 104433 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.624 104433 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.624 104433 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.624 104433 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.624 104433 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.624 104433 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.625 104433 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.625 104433 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.625 104433 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.625 104433 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.625 104433 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.625 104433 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.625 104433 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.626 104433 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.626 104433 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.626 104433 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.626 104433 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.626 104433 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.626 104433 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.626 104433 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.627 104433 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.627 104433 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.627 104433 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.627 104433 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.627 104433 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.627 104433 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.627 104433 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.628 104433 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.628 104433 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.628 104433 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.628 104433 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.628 104433 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.628 104433 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.628 104433 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.628 104433 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.629 104433 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.629 104433 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.629 104433 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.629 104433 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.629 104433 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.629 104433 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.629 104433 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.630 104433 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.630 104433 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.630 104433 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.630 104433 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.630 104433 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.630 104433 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.630 104433 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.631 104433 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.631 104433 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.631 104433 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.631 104433 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.631 104433 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.631 104433 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.631 104433 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.631 104433 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.632 104433 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.632 104433 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.632 104433 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.632 104433 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.632 104433 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.632 104433 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.632 104433 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.633 104433 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.633 104433 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.633 104433 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.633 104433 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.633 104433 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.633 104433 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.633 104433 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.634 104433 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.634 104433 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.634 104433 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.634 104433 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.634 104433 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.634 104433 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.634 104433 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.635 104433 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.635 104433 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.635 104433 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.635 104433 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.635 104433 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.635 104433 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.635 104433 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.635 104433 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.636 104433 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.636 104433 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.636 104433 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.636 104433 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.636 104433 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.636 104433 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.636 104433 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.637 104433 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.637 104433 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.637 104433 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.637 104433 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.637 104433 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.637 104433 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.637 104433 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.638 104433 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.638 104433 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.638 104433 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.638 104433 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.638 104433 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.638 104433 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.638 104433 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.639 104433 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.639 104433 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.639 104433 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.639 104433 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.639 104433 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.639 104433 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.639 104433 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.640 104433 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.640 104433 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.640 104433 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.640 104433 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.640 104433 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.640 104433 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.640 104433 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.641 104433 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.641 104433 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.641 104433 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.641 104433 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.641 104433 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.641 104433 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.641 104433 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.641 104433 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.652 104433 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.653 104433 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.653 104433 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.653 104433 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.654 104433 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.668 104433 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 2ad2dbac-a967-40fb-b69b-7c374c5f8e9d (UUID: 2ad2dbac-a967-40fb-b69b-7c374c5f8e9d) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.697 104433 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.697 104433 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.697 104433 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.697 104433 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.701 104433 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.708 104433 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.715 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '2ad2dbac-a967-40fb-b69b-7c374c5f8e9d'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], external_ids={}, name=2ad2dbac-a967-40fb-b69b-7c374c5f8e9d, nb_cfg_timestamp=1764350016925, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.716 104433 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7fda326f8b50>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.716 104433 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.717 104433 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.717 104433 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.717 104433 INFO oslo_service.service [-] Starting 1 workers
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.726 104433 DEBUG oslo_service.service [-] Started child 104541 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.730 104541 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-1020829'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.730 104433 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmp_m6g8bg_/privsep.sock']
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.761 104541 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.761 104541 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.761 104541 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.764 104541 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.771 104541 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Nov 28 17:14:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:27.778 104541 INFO eventlet.wsgi.server [-] (104541) wsgi starting up on http:/var/lib/neutron/metadata_proxy
Nov 28 17:14:28 compute-0 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Nov 28 17:14:28 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:28.383 104433 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Nov 28 17:14:28 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:28.384 104433 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp_m6g8bg_/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Nov 28 17:14:28 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:28.276 104546 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 28 17:14:28 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:28.281 104546 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 28 17:14:28 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:28.283 104546 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Nov 28 17:14:28 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:28.283 104546 INFO oslo.privsep.daemon [-] privsep daemon running as pid 104546
Nov 28 17:14:28 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:28.386 104546 DEBUG oslo.privsep.daemon [-] privsep: reply[371780d3-014d-4808-b8fa-7429d4cb6dbb]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:14:28 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:28.845 104546 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:14:28 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:28.845 104546 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:14:28 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:28.845 104546 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:14:29 compute-0 podman[104551]: 2025-11-28 17:14:29.243626107 +0000 UTC m=+0.109065950 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.360 104546 DEBUG oslo.privsep.daemon [-] privsep: reply[e1b2386d-138a-487f-8bdd-e9acf16702a6]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.363 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=2ad2dbac-a967-40fb-b69b-7c374c5f8e9d, column=external_ids, values=({'neutron:ovn-metadata-id': '012c8ef5-b8dc-5906-a980-82db5a57060e'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.372 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2ad2dbac-a967-40fb-b69b-7c374c5f8e9d, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.378 104433 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.378 104433 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.379 104433 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.379 104433 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.379 104433 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.379 104433 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.379 104433 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.379 104433 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.379 104433 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.380 104433 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.380 104433 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.380 104433 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.380 104433 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.380 104433 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.380 104433 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.381 104433 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.381 104433 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.381 104433 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.381 104433 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.381 104433 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.381 104433 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.381 104433 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.382 104433 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.382 104433 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.382 104433 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.382 104433 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.382 104433 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.382 104433 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.382 104433 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.383 104433 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.383 104433 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.383 104433 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.383 104433 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.383 104433 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.383 104433 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.383 104433 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.383 104433 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.384 104433 DEBUG oslo_service.service [-] host                           = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.384 104433 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.384 104433 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.384 104433 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.384 104433 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.384 104433 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.384 104433 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.384 104433 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.385 104433 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.385 104433 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.385 104433 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.385 104433 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.385 104433 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.385 104433 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.385 104433 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.385 104433 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.385 104433 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.386 104433 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.386 104433 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.386 104433 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.386 104433 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.386 104433 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.386 104433 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.386 104433 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.386 104433 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.386 104433 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.387 104433 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.387 104433 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.387 104433 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.387 104433 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.387 104433 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.387 104433 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.387 104433 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.387 104433 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.387 104433 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.388 104433 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.388 104433 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.388 104433 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.388 104433 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.388 104433 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.388 104433 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.388 104433 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.388 104433 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.389 104433 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.389 104433 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.389 104433 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.389 104433 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.389 104433 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.389 104433 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.389 104433 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.389 104433 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.389 104433 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.389 104433 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.390 104433 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.390 104433 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.390 104433 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.390 104433 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.390 104433 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.390 104433 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.390 104433 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.390 104433 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.390 104433 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.390 104433 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.391 104433 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.391 104433 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.391 104433 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.391 104433 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.391 104433 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.391 104433 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.391 104433 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.391 104433 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.391 104433 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.392 104433 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.392 104433 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.392 104433 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.392 104433 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.392 104433 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.392 104433 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.392 104433 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.392 104433 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.392 104433 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.393 104433 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.393 104433 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.393 104433 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.393 104433 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.393 104433 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.393 104433 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.393 104433 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.394 104433 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.394 104433 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.394 104433 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.394 104433 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.394 104433 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.394 104433 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.394 104433 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.394 104433 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.394 104433 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.395 104433 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.395 104433 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.395 104433 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.395 104433 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.395 104433 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.395 104433 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.395 104433 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.395 104433 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.396 104433 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.396 104433 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.396 104433 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.396 104433 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.396 104433 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.396 104433 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.396 104433 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.396 104433 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.396 104433 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.396 104433 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.397 104433 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.397 104433 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.397 104433 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.397 104433 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.397 104433 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.397 104433 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.397 104433 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.397 104433 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.397 104433 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.398 104433 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.398 104433 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.398 104433 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.398 104433 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.398 104433 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.398 104433 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.398 104433 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.398 104433 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.398 104433 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.398 104433 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.399 104433 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.399 104433 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.399 104433 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.399 104433 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.399 104433 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.399 104433 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.399 104433 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.399 104433 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.399 104433 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.400 104433 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.400 104433 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.400 104433 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.400 104433 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.400 104433 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.400 104433 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.400 104433 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.400 104433 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.401 104433 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.401 104433 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.401 104433 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.401 104433 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.401 104433 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.401 104433 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.401 104433 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.401 104433 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.402 104433 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.402 104433 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.402 104433 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.402 104433 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.402 104433 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.402 104433 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.402 104433 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.402 104433 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.403 104433 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.403 104433 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.403 104433 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.403 104433 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.403 104433 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.403 104433 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.403 104433 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.403 104433 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.403 104433 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.403 104433 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.404 104433 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.404 104433 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.404 104433 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.404 104433 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.404 104433 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.404 104433 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.404 104433 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.404 104433 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.405 104433 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.405 104433 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.405 104433 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.405 104433 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.405 104433 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.405 104433 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.405 104433 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.405 104433 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.405 104433 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.405 104433 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.406 104433 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.406 104433 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.406 104433 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.406 104433 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.406 104433 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.406 104433 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.406 104433 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.406 104433 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.406 104433 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.406 104433 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.407 104433 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.407 104433 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.407 104433 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.407 104433 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.407 104433 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.407 104433 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.407 104433 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.407 104433 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.407 104433 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.408 104433 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.408 104433 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.408 104433 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.408 104433 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.408 104433 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.408 104433 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.408 104433 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.408 104433 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.408 104433 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.408 104433 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.409 104433 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.409 104433 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.409 104433 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.409 104433 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.409 104433 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.409 104433 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.409 104433 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.409 104433 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.409 104433 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.410 104433 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.410 104433 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.410 104433 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.410 104433 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.410 104433 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.410 104433 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.410 104433 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.410 104433 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.410 104433 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.411 104433 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.411 104433 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.411 104433 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.411 104433 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.411 104433 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.411 104433 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.411 104433 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.411 104433 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.411 104433 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.412 104433 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.412 104433 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.412 104433 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.412 104433 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.412 104433 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.412 104433 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.412 104433 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.412 104433 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.412 104433 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.412 104433 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.413 104433 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.413 104433 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.413 104433 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.413 104433 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.413 104433 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:14:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:14:29.413 104433 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 28 17:14:32 compute-0 sshd-session[104578]: Accepted publickey for zuul from 192.168.122.30 port 44296 ssh2: ECDSA SHA256:WZF57Px3euv6N78lWIGFzFXKuQ6jzdjGOx/DgOfNYD8
Nov 28 17:14:32 compute-0 systemd-logind[788]: New session 23 of user zuul.
Nov 28 17:14:32 compute-0 systemd[1]: Started Session 23 of User zuul.
Nov 28 17:14:32 compute-0 sshd-session[104578]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 28 17:14:33 compute-0 python3.9[104731]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 17:14:34 compute-0 sudo[104885]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lpseyftpylrybdmiqapitasdeipsofqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350073.6649113-52-136893122025998/AnsiballZ_command.py'
Nov 28 17:14:34 compute-0 sudo[104885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:14:34 compute-0 python3.9[104887]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 17:14:34 compute-0 sudo[104885]: pam_unix(sudo:session): session closed for user root
Nov 28 17:14:35 compute-0 sudo[105050]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqeryfvmagfceqmjdgjrbcmrxijzqgnz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350074.7029326-74-211170603929799/AnsiballZ_systemd_service.py'
Nov 28 17:14:35 compute-0 sudo[105050]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:14:35 compute-0 python3.9[105052]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 17:14:35 compute-0 systemd[1]: Reloading.
Nov 28 17:14:35 compute-0 systemd-rc-local-generator[105080]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 17:14:35 compute-0 systemd-sysv-generator[105084]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 17:14:35 compute-0 sudo[105050]: pam_unix(sudo:session): session closed for user root
Nov 28 17:14:36 compute-0 python3.9[105237]: ansible-ansible.builtin.service_facts Invoked
Nov 28 17:14:36 compute-0 network[105254]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 28 17:14:36 compute-0 network[105255]: 'network-scripts' will be removed from distribution in near future.
Nov 28 17:14:36 compute-0 network[105256]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 28 17:14:39 compute-0 sudo[105515]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-orduqqlkwlvxgjsglxjuhgndzubtbdld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350079.62731-112-258761603018686/AnsiballZ_systemd_service.py'
Nov 28 17:14:39 compute-0 sudo[105515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:14:40 compute-0 python3.9[105517]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 17:14:40 compute-0 sudo[105515]: pam_unix(sudo:session): session closed for user root
Nov 28 17:14:40 compute-0 sudo[105668]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwyumbdareqtsplafkdhnhijkkrkeqld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350080.4382725-112-132691623830363/AnsiballZ_systemd_service.py'
Nov 28 17:14:40 compute-0 sudo[105668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:14:41 compute-0 python3.9[105670]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 17:14:41 compute-0 sudo[105668]: pam_unix(sudo:session): session closed for user root
Nov 28 17:14:41 compute-0 sudo[105821]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jnwlckepdxnalrxqlhpvkiiclavyynyc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350081.1935623-112-272429203924009/AnsiballZ_systemd_service.py'
Nov 28 17:14:41 compute-0 sudo[105821]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:14:41 compute-0 python3.9[105823]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 17:14:41 compute-0 sudo[105821]: pam_unix(sudo:session): session closed for user root
Nov 28 17:14:42 compute-0 sudo[105974]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjbgvegayutsgbrgeeoivwapztruimsn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350081.9585414-112-34478050159668/AnsiballZ_systemd_service.py'
Nov 28 17:14:42 compute-0 sudo[105974]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:14:42 compute-0 python3.9[105976]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 17:14:42 compute-0 sudo[105974]: pam_unix(sudo:session): session closed for user root
Nov 28 17:14:42 compute-0 sudo[106127]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxzksdvduljglymmzlzhdzxsomudmaov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350082.720205-112-23698648358893/AnsiballZ_systemd_service.py'
Nov 28 17:14:43 compute-0 sudo[106127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:14:43 compute-0 python3.9[106129]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 17:14:43 compute-0 sudo[106127]: pam_unix(sudo:session): session closed for user root
Nov 28 17:14:43 compute-0 sudo[106280]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hdwyjzxgbvecqxivfsiborooqlvubdbg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350083.4630554-112-71604670083047/AnsiballZ_systemd_service.py'
Nov 28 17:14:43 compute-0 sudo[106280]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:14:44 compute-0 python3.9[106282]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 17:14:44 compute-0 sudo[106280]: pam_unix(sudo:session): session closed for user root
Nov 28 17:14:44 compute-0 sudo[106433]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mairqvaxydxycsthhkxhsrizwyhudcze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350084.2160702-112-121589572773931/AnsiballZ_systemd_service.py'
Nov 28 17:14:44 compute-0 sudo[106433]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:14:44 compute-0 python3.9[106435]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 17:14:44 compute-0 sudo[106433]: pam_unix(sudo:session): session closed for user root
Nov 28 17:14:45 compute-0 sudo[106586]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jtimkqbjbeebmxicwtrvcgowzctqfxaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350085.2117114-216-218563849117356/AnsiballZ_file.py'
Nov 28 17:14:45 compute-0 sudo[106586]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:14:45 compute-0 python3.9[106588]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:14:45 compute-0 sudo[106586]: pam_unix(sudo:session): session closed for user root
Nov 28 17:14:46 compute-0 sudo[106738]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-idudipgnzurejuelklekyqmmygcgscgr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350086.0507174-216-26324081769604/AnsiballZ_file.py'
Nov 28 17:14:46 compute-0 sudo[106738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:14:46 compute-0 python3.9[106740]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:14:46 compute-0 sudo[106738]: pam_unix(sudo:session): session closed for user root
Nov 28 17:14:47 compute-0 sudo[106890]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mzikyaaygabpeyivapykysiffjgeprlk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350086.740007-216-185126854579848/AnsiballZ_file.py'
Nov 28 17:14:47 compute-0 sudo[106890]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:14:47 compute-0 python3.9[106892]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:14:47 compute-0 sudo[106890]: pam_unix(sudo:session): session closed for user root
Nov 28 17:14:47 compute-0 sudo[107042]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bipujywxfbwwnbenboiyqjlfmujsyjvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350087.3960092-216-155827387454773/AnsiballZ_file.py'
Nov 28 17:14:47 compute-0 sudo[107042]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:14:47 compute-0 python3.9[107044]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:14:47 compute-0 sudo[107042]: pam_unix(sudo:session): session closed for user root
Nov 28 17:14:48 compute-0 sudo[107194]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjcgidozqhteffutjhywklvpldfufjre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350087.9809768-216-167727462012364/AnsiballZ_file.py'
Nov 28 17:14:48 compute-0 sudo[107194]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:14:48 compute-0 python3.9[107196]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:14:48 compute-0 sudo[107194]: pam_unix(sudo:session): session closed for user root
Nov 28 17:14:48 compute-0 sudo[107346]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utjxrnhldzlxifgobquxgqzfilhrtwvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350088.641543-216-90212562119216/AnsiballZ_file.py'
Nov 28 17:14:48 compute-0 sudo[107346]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:14:49 compute-0 python3.9[107348]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:14:49 compute-0 sudo[107346]: pam_unix(sudo:session): session closed for user root
Nov 28 17:14:49 compute-0 sudo[107498]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hzsujnrqkpwkalfemdeialoljkaqhdap ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350089.2597592-216-83925021575880/AnsiballZ_file.py'
Nov 28 17:14:49 compute-0 sudo[107498]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:14:49 compute-0 python3.9[107500]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:14:49 compute-0 sudo[107498]: pam_unix(sudo:session): session closed for user root
Nov 28 17:14:50 compute-0 sudo[107650]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sstooitxvpchpbpowzhirejjyugtfqon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350089.9413693-316-107034828165715/AnsiballZ_file.py'
Nov 28 17:14:50 compute-0 sudo[107650]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:14:50 compute-0 python3.9[107652]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:14:50 compute-0 sudo[107650]: pam_unix(sudo:session): session closed for user root
Nov 28 17:14:50 compute-0 sudo[107802]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dhvwnmlmhpuxwhfmadkwcgavcfnopydc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350090.5670629-316-63415424959895/AnsiballZ_file.py'
Nov 28 17:14:50 compute-0 sudo[107802]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:14:50 compute-0 python3.9[107804]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:14:51 compute-0 sudo[107802]: pam_unix(sudo:session): session closed for user root
Nov 28 17:14:51 compute-0 sudo[107954]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxydklhiayqcgmwpftobobfvxvzfufvo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350091.148525-316-119584785855504/AnsiballZ_file.py'
Nov 28 17:14:51 compute-0 sudo[107954]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:14:51 compute-0 python3.9[107956]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:14:51 compute-0 sudo[107954]: pam_unix(sudo:session): session closed for user root
Nov 28 17:14:52 compute-0 sudo[108106]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abbngyyibqmgbdnieiosipqvkqtxnrru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350091.7378757-316-63131174947845/AnsiballZ_file.py'
Nov 28 17:14:52 compute-0 sudo[108106]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:14:52 compute-0 python3.9[108108]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:14:52 compute-0 sudo[108106]: pam_unix(sudo:session): session closed for user root
Nov 28 17:14:52 compute-0 sudo[108258]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hjnnjnciwxqrijpysxxqiuzbbplhczzd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350092.4094594-316-204757452746968/AnsiballZ_file.py'
Nov 28 17:14:52 compute-0 sudo[108258]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:14:52 compute-0 python3.9[108260]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:14:52 compute-0 sudo[108258]: pam_unix(sudo:session): session closed for user root
Nov 28 17:14:53 compute-0 sudo[108410]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wssylixmlsuzuzguhagaffbmgvkgfjwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350093.0012503-316-31933703945404/AnsiballZ_file.py'
Nov 28 17:14:53 compute-0 sudo[108410]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:14:53 compute-0 python3.9[108412]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:14:53 compute-0 sudo[108410]: pam_unix(sudo:session): session closed for user root
Nov 28 17:14:53 compute-0 sudo[108562]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gksnhajmcnsncnranybqwgrgftuelglc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350093.6135795-316-34156764844215/AnsiballZ_file.py'
Nov 28 17:14:53 compute-0 sudo[108562]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:14:54 compute-0 python3.9[108564]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:14:54 compute-0 sudo[108562]: pam_unix(sudo:session): session closed for user root
Nov 28 17:14:54 compute-0 sudo[108714]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-deiireqkgwpqfichqjpfwzrgyecohnvy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350094.458502-418-46053580685788/AnsiballZ_command.py'
Nov 28 17:14:54 compute-0 sudo[108714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:14:54 compute-0 python3.9[108716]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 17:14:54 compute-0 sudo[108714]: pam_unix(sudo:session): session closed for user root
Nov 28 17:14:55 compute-0 python3.9[108868]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 28 17:14:56 compute-0 podman[108916]: 2025-11-28 17:14:56.231549094 +0000 UTC m=+0.058288360 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 17:14:56 compute-0 sudo[109037]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzyebzjofaprskynmnypqxdlanhhuxjf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350096.1065145-454-91449711208576/AnsiballZ_systemd_service.py'
Nov 28 17:14:56 compute-0 sudo[109037]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:14:56 compute-0 python3.9[109039]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 17:14:56 compute-0 systemd[1]: Reloading.
Nov 28 17:14:56 compute-0 systemd-rc-local-generator[109065]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 17:14:56 compute-0 systemd-sysv-generator[109069]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 17:14:56 compute-0 sudo[109037]: pam_unix(sudo:session): session closed for user root
Nov 28 17:14:57 compute-0 sudo[109224]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pcnwantvptjpksvearprbeosbyvhjeyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350097.1761155-470-211885203024921/AnsiballZ_command.py'
Nov 28 17:14:57 compute-0 sudo[109224]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:14:57 compute-0 python3.9[109226]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 17:14:57 compute-0 sudo[109224]: pam_unix(sudo:session): session closed for user root
Nov 28 17:14:58 compute-0 sudo[109377]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wlwmtbprwfeowibszxjjirxmeccpnqar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350097.814613-470-142339836174263/AnsiballZ_command.py'
Nov 28 17:14:58 compute-0 sudo[109377]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:14:58 compute-0 python3.9[109379]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 17:14:58 compute-0 sudo[109377]: pam_unix(sudo:session): session closed for user root
Nov 28 17:14:58 compute-0 sudo[109530]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yznqlrslbfweljbqimlndwapcyxbixcg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350098.4053435-470-223903262765928/AnsiballZ_command.py'
Nov 28 17:14:58 compute-0 sudo[109530]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:14:58 compute-0 python3.9[109532]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 17:14:58 compute-0 sudo[109530]: pam_unix(sudo:session): session closed for user root
Nov 28 17:14:59 compute-0 sudo[109683]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbtegymumqirephkeeqpuydzyudzxbzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350098.975684-470-29052769470018/AnsiballZ_command.py'
Nov 28 17:14:59 compute-0 sudo[109683]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:14:59 compute-0 python3.9[109685]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 17:14:59 compute-0 sudo[109683]: pam_unix(sudo:session): session closed for user root
Nov 28 17:14:59 compute-0 podman[109687]: 2025-11-28 17:14:59.530349758 +0000 UTC m=+0.092260723 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller)
Nov 28 17:15:00 compute-0 sudo[109862]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ittlmzyxuxayxctplsyhgyvxhjfeyjzn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350100.4253523-470-172956639835780/AnsiballZ_command.py'
Nov 28 17:15:00 compute-0 sudo[109862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:15:00 compute-0 python3.9[109864]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 17:15:00 compute-0 sudo[109862]: pam_unix(sudo:session): session closed for user root
Nov 28 17:15:01 compute-0 sudo[110015]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-konzbentwqzrtvtnqqbbpyrajaxktnuq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350101.0488598-470-142104643198322/AnsiballZ_command.py'
Nov 28 17:15:01 compute-0 sudo[110015]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:15:01 compute-0 python3.9[110017]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 17:15:01 compute-0 sudo[110015]: pam_unix(sudo:session): session closed for user root
Nov 28 17:15:02 compute-0 sudo[110168]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nlihvwsauukcqqwgocmqcmqstbnnpnii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350101.6656578-470-207661177984248/AnsiballZ_command.py'
Nov 28 17:15:02 compute-0 sudo[110168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:15:02 compute-0 python3.9[110170]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 17:15:02 compute-0 sudo[110168]: pam_unix(sudo:session): session closed for user root
Nov 28 17:15:03 compute-0 sudo[110322]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwxrmshiyhwiyujvxwkdavzwayhoopgl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350103.4747593-578-144957980721427/AnsiballZ_getent.py'
Nov 28 17:15:03 compute-0 sudo[110322]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:15:04 compute-0 python3.9[110324]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Nov 28 17:15:04 compute-0 sudo[110322]: pam_unix(sudo:session): session closed for user root
Nov 28 17:15:04 compute-0 sudo[110475]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxbyrjdccfpbidffxayvvocvsetyxbbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350104.305174-594-161639598736634/AnsiballZ_group.py'
Nov 28 17:15:04 compute-0 sudo[110475]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:15:04 compute-0 python3.9[110477]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 28 17:15:04 compute-0 groupadd[110478]: group added to /etc/group: name=libvirt, GID=42473
Nov 28 17:15:04 compute-0 groupadd[110478]: group added to /etc/gshadow: name=libvirt
Nov 28 17:15:04 compute-0 groupadd[110478]: new group: name=libvirt, GID=42473
Nov 28 17:15:04 compute-0 sudo[110475]: pam_unix(sudo:session): session closed for user root
Nov 28 17:15:05 compute-0 sudo[110633]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzxydmgrebhdzzykjrrfllnppaezuhda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350105.2401376-610-113004893522462/AnsiballZ_user.py'
Nov 28 17:15:05 compute-0 sudo[110633]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:15:05 compute-0 python3.9[110635]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 28 17:15:05 compute-0 useradd[110637]: new user: name=libvirt, UID=42473, GID=42473, home=/home/libvirt, shell=/sbin/nologin, from=/dev/pts/0
Nov 28 17:15:06 compute-0 sudo[110633]: pam_unix(sudo:session): session closed for user root
Nov 28 17:15:07 compute-0 sudo[110793]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzxadjonxgncadmiqvzxnzbcdbwsjksl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350106.9354143-632-167926728769823/AnsiballZ_setup.py'
Nov 28 17:15:07 compute-0 sudo[110793]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:15:07 compute-0 python3.9[110795]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 17:15:07 compute-0 sudo[110793]: pam_unix(sudo:session): session closed for user root
Nov 28 17:15:08 compute-0 sudo[110877]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlsjkabbkbewbpkbvjlvetogvshogbna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350106.9354143-632-167926728769823/AnsiballZ_dnf.py'
Nov 28 17:15:08 compute-0 sudo[110877]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:15:08 compute-0 python3.9[110879]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 17:15:27 compute-0 podman[111070]: 2025-11-28 17:15:27.221768077 +0000 UTC m=+0.074591718 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 17:15:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:15:27.655 104433 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:15:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:15:27.656 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:15:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:15:27.656 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:15:30 compute-0 podman[111089]: 2025-11-28 17:15:30.257870727 +0000 UTC m=+0.124650599 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0)
Nov 28 17:15:35 compute-0 kernel: SELinux:  Converting 2758 SID table entries...
Nov 28 17:15:35 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Nov 28 17:15:35 compute-0 kernel: SELinux:  policy capability open_perms=1
Nov 28 17:15:35 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Nov 28 17:15:35 compute-0 kernel: SELinux:  policy capability always_check_network=0
Nov 28 17:15:35 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 28 17:15:35 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 28 17:15:35 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 28 17:15:45 compute-0 kernel: SELinux:  Converting 2758 SID table entries...
Nov 28 17:15:45 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Nov 28 17:15:45 compute-0 kernel: SELinux:  policy capability open_perms=1
Nov 28 17:15:45 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Nov 28 17:15:45 compute-0 kernel: SELinux:  policy capability always_check_network=0
Nov 28 17:15:45 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 28 17:15:45 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 28 17:15:45 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 28 17:15:58 compute-0 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Nov 28 17:15:58 compute-0 podman[111593]: 2025-11-28 17:15:58.231326636 +0000 UTC m=+0.070152106 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent)
Nov 28 17:16:01 compute-0 podman[113377]: 2025-11-28 17:16:01.236399738 +0000 UTC m=+0.091001503 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 28 17:16:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:16:27.655 104433 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:16:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:16:27.657 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:16:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:16:27.657 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:16:29 compute-0 podman[127974]: 2025-11-28 17:16:29.230682643 +0000 UTC m=+0.078549719 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 28 17:16:32 compute-0 podman[127993]: 2025-11-28 17:16:32.227886414 +0000 UTC m=+0.090675797 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0)
Nov 28 17:16:36 compute-0 kernel: SELinux:  Converting 2759 SID table entries...
Nov 28 17:16:36 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Nov 28 17:16:36 compute-0 kernel: SELinux:  policy capability open_perms=1
Nov 28 17:16:36 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Nov 28 17:16:36 compute-0 kernel: SELinux:  policy capability always_check_network=0
Nov 28 17:16:36 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 28 17:16:36 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 28 17:16:36 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 28 17:16:38 compute-0 groupadd[128031]: group added to /etc/group: name=dnsmasq, GID=992
Nov 28 17:16:38 compute-0 groupadd[128031]: group added to /etc/gshadow: name=dnsmasq
Nov 28 17:16:38 compute-0 groupadd[128031]: new group: name=dnsmasq, GID=992
Nov 28 17:16:38 compute-0 useradd[128038]: new user: name=dnsmasq, UID=992, GID=992, home=/var/lib/dnsmasq, shell=/usr/sbin/nologin, from=none
Nov 28 17:16:38 compute-0 dbus-broker-launch[758]: Noticed file-system modification, trigger reload.
Nov 28 17:16:38 compute-0 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=14 res=1
Nov 28 17:16:38 compute-0 dbus-broker-launch[758]: Noticed file-system modification, trigger reload.
Nov 28 17:16:39 compute-0 groupadd[128051]: group added to /etc/group: name=clevis, GID=991
Nov 28 17:16:39 compute-0 groupadd[128051]: group added to /etc/gshadow: name=clevis
Nov 28 17:16:39 compute-0 groupadd[128051]: new group: name=clevis, GID=991
Nov 28 17:16:39 compute-0 useradd[128058]: new user: name=clevis, UID=991, GID=991, home=/var/cache/clevis, shell=/usr/sbin/nologin, from=none
Nov 28 17:16:39 compute-0 usermod[128068]: add 'clevis' to group 'tss'
Nov 28 17:16:39 compute-0 usermod[128068]: add 'clevis' to shadow group 'tss'
Nov 28 17:16:42 compute-0 polkitd[43837]: Reloading rules
Nov 28 17:16:42 compute-0 polkitd[43837]: Collecting garbage unconditionally...
Nov 28 17:16:42 compute-0 polkitd[43837]: Loading rules from directory /etc/polkit-1/rules.d
Nov 28 17:16:42 compute-0 polkitd[43837]: Loading rules from directory /usr/share/polkit-1/rules.d
Nov 28 17:16:42 compute-0 polkitd[43837]: Finished loading, compiling and executing 3 rules
Nov 28 17:16:42 compute-0 polkitd[43837]: Reloading rules
Nov 28 17:16:42 compute-0 polkitd[43837]: Collecting garbage unconditionally...
Nov 28 17:16:42 compute-0 polkitd[43837]: Loading rules from directory /etc/polkit-1/rules.d
Nov 28 17:16:42 compute-0 polkitd[43837]: Loading rules from directory /usr/share/polkit-1/rules.d
Nov 28 17:16:42 compute-0 polkitd[43837]: Finished loading, compiling and executing 3 rules
Nov 28 17:16:43 compute-0 groupadd[128255]: group added to /etc/group: name=ceph, GID=167
Nov 28 17:16:43 compute-0 groupadd[128255]: group added to /etc/gshadow: name=ceph
Nov 28 17:16:43 compute-0 groupadd[128255]: new group: name=ceph, GID=167
Nov 28 17:16:43 compute-0 useradd[128261]: new user: name=ceph, UID=167, GID=167, home=/var/lib/ceph, shell=/sbin/nologin, from=none
Nov 28 17:16:46 compute-0 systemd[1]: Stopping OpenSSH server daemon...
Nov 28 17:16:46 compute-0 sshd[1006]: Received signal 15; terminating.
Nov 28 17:16:46 compute-0 systemd[1]: sshd.service: Deactivated successfully.
Nov 28 17:16:46 compute-0 systemd[1]: Stopped OpenSSH server daemon.
Nov 28 17:16:46 compute-0 systemd[1]: sshd.service: Consumed 3.658s CPU time, read 32.0K from disk, written 96.0K to disk.
Nov 28 17:16:46 compute-0 systemd[1]: Stopped target sshd-keygen.target.
Nov 28 17:16:46 compute-0 systemd[1]: Stopping sshd-keygen.target...
Nov 28 17:16:46 compute-0 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 28 17:16:46 compute-0 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 28 17:16:46 compute-0 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 28 17:16:46 compute-0 systemd[1]: Reached target sshd-keygen.target.
Nov 28 17:16:46 compute-0 systemd[1]: Starting OpenSSH server daemon...
Nov 28 17:16:46 compute-0 sshd[128780]: Server listening on 0.0.0.0 port 22.
Nov 28 17:16:46 compute-0 sshd[128780]: Server listening on :: port 22.
Nov 28 17:16:46 compute-0 systemd[1]: Started OpenSSH server daemon.
Nov 28 17:16:48 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 28 17:16:48 compute-0 systemd[1]: Starting man-db-cache-update.service...
Nov 28 17:16:48 compute-0 systemd[1]: Reloading.
Nov 28 17:16:48 compute-0 systemd-sysv-generator[129040]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 17:16:48 compute-0 systemd-rc-local-generator[129036]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 17:16:48 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 28 17:16:52 compute-0 sudo[110877]: pam_unix(sudo:session): session closed for user root
Nov 28 17:16:56 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 28 17:16:56 compute-0 systemd[1]: Finished man-db-cache-update.service.
Nov 28 17:16:56 compute-0 systemd[1]: man-db-cache-update.service: Consumed 10.158s CPU time.
Nov 28 17:16:56 compute-0 systemd[1]: run-rd408b0d40f234a8992676097bd1d921e.service: Deactivated successfully.
Nov 28 17:17:00 compute-0 podman[137436]: 2025-11-28 17:17:00.215298148 +0000 UTC m=+0.070795736 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 28 17:17:03 compute-0 podman[137455]: 2025-11-28 17:17:03.237642641 +0000 UTC m=+0.091755828 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Nov 28 17:17:04 compute-0 sudo[137606]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-giseprxvgpgibdsksoxhoklyntsoqlnu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350223.188696-656-106793452545803/AnsiballZ_systemd.py'
Nov 28 17:17:04 compute-0 sudo[137606]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:17:04 compute-0 python3.9[137608]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 28 17:17:04 compute-0 systemd[1]: Reloading.
Nov 28 17:17:04 compute-0 systemd-sysv-generator[137641]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 17:17:04 compute-0 systemd-rc-local-generator[137637]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 17:17:04 compute-0 sudo[137606]: pam_unix(sudo:session): session closed for user root
Nov 28 17:17:05 compute-0 sudo[137796]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aouptyvndgmutvtmwraahccsfbtwcivv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350224.9685516-656-164275591943848/AnsiballZ_systemd.py'
Nov 28 17:17:05 compute-0 sudo[137796]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:17:05 compute-0 python3.9[137798]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 28 17:17:05 compute-0 systemd[1]: Reloading.
Nov 28 17:17:05 compute-0 systemd-rc-local-generator[137829]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 17:17:05 compute-0 systemd-sysv-generator[137833]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 17:17:05 compute-0 sudo[137796]: pam_unix(sudo:session): session closed for user root
Nov 28 17:17:06 compute-0 sudo[137987]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvtfqvhcrjwmbdjozhsgafgihusrrgqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350226.3278923-656-233730899444658/AnsiballZ_systemd.py'
Nov 28 17:17:06 compute-0 sudo[137987]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:17:06 compute-0 python3.9[137989]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 28 17:17:07 compute-0 systemd[1]: Reloading.
Nov 28 17:17:07 compute-0 systemd-rc-local-generator[138019]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 17:17:07 compute-0 systemd-sysv-generator[138023]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 17:17:07 compute-0 sudo[137987]: pam_unix(sudo:session): session closed for user root
Nov 28 17:17:08 compute-0 sudo[138177]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ypyafrzpzbtecrzchcsbffpjjmvosjqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350227.4943185-656-251618537823188/AnsiballZ_systemd.py'
Nov 28 17:17:08 compute-0 sudo[138177]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:17:08 compute-0 python3.9[138179]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 28 17:17:08 compute-0 systemd[1]: Reloading.
Nov 28 17:17:08 compute-0 systemd-rc-local-generator[138208]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 17:17:08 compute-0 systemd-sysv-generator[138212]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 17:17:08 compute-0 sudo[138177]: pam_unix(sudo:session): session closed for user root
Nov 28 17:17:09 compute-0 sudo[138366]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hyixswjtmlbdqtcjeccchzlbylquwtfo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350229.5615509-714-36007135108246/AnsiballZ_systemd.py'
Nov 28 17:17:09 compute-0 sudo[138366]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:17:10 compute-0 python3.9[138368]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 17:17:10 compute-0 systemd[1]: Reloading.
Nov 28 17:17:10 compute-0 systemd-rc-local-generator[138401]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 17:17:10 compute-0 systemd-sysv-generator[138404]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 17:17:10 compute-0 sudo[138366]: pam_unix(sudo:session): session closed for user root
Nov 28 17:17:11 compute-0 sudo[138558]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzgrporxvlzdidofievfdgeonpealxwt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350230.734537-714-77941372600098/AnsiballZ_systemd.py'
Nov 28 17:17:11 compute-0 sudo[138558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:17:11 compute-0 python3.9[138560]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 17:17:11 compute-0 systemd[1]: Reloading.
Nov 28 17:17:11 compute-0 systemd-rc-local-generator[138592]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 17:17:11 compute-0 systemd-sysv-generator[138596]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 17:17:11 compute-0 sudo[138558]: pam_unix(sudo:session): session closed for user root
Nov 28 17:17:12 compute-0 sudo[138749]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbkggxqtrhtitndifvulucbughcwpsgu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350232.1847346-714-168173552452296/AnsiballZ_systemd.py'
Nov 28 17:17:12 compute-0 sudo[138749]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:17:12 compute-0 python3.9[138751]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 17:17:12 compute-0 systemd[1]: Reloading.
Nov 28 17:17:12 compute-0 systemd-rc-local-generator[138784]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 17:17:12 compute-0 systemd-sysv-generator[138788]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 17:17:13 compute-0 sudo[138749]: pam_unix(sudo:session): session closed for user root
Nov 28 17:17:13 compute-0 sudo[138940]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjlfypskfbordjqzteyfgbdjbrxcewxg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350233.4067376-714-162741784093436/AnsiballZ_systemd.py'
Nov 28 17:17:13 compute-0 sudo[138940]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:17:14 compute-0 python3.9[138942]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 17:17:14 compute-0 sudo[138940]: pam_unix(sudo:session): session closed for user root
Nov 28 17:17:14 compute-0 sudo[139095]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kutaxaspdrylpjwtpiffhkmtueuezzdf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350234.581246-714-188480729258305/AnsiballZ_systemd.py'
Nov 28 17:17:14 compute-0 sudo[139095]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:17:15 compute-0 python3.9[139097]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 17:17:15 compute-0 systemd[1]: Reloading.
Nov 28 17:17:15 compute-0 systemd-rc-local-generator[139125]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 17:17:15 compute-0 systemd-sysv-generator[139130]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 17:17:15 compute-0 sudo[139095]: pam_unix(sudo:session): session closed for user root
Nov 28 17:17:16 compute-0 sudo[139285]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxwtoawmunrscqonalmdqcrwpnajuxkw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350236.5140939-786-130018540096521/AnsiballZ_systemd.py'
Nov 28 17:17:16 compute-0 sudo[139285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:17:17 compute-0 python3.9[139287]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 28 17:17:17 compute-0 systemd[1]: Reloading.
Nov 28 17:17:17 compute-0 systemd-rc-local-generator[139319]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 17:17:17 compute-0 systemd-sysv-generator[139323]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 17:17:17 compute-0 systemd[1]: Listening on libvirt proxy daemon socket.
Nov 28 17:17:17 compute-0 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Nov 28 17:17:17 compute-0 sudo[139285]: pam_unix(sudo:session): session closed for user root
Nov 28 17:17:18 compute-0 sudo[139479]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vtnqjewsaasbfnsscvxddoxquuqqyjni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350237.857125-802-255221644284026/AnsiballZ_systemd.py'
Nov 28 17:17:18 compute-0 sudo[139479]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:17:18 compute-0 python3.9[139481]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 17:17:18 compute-0 sudo[139479]: pam_unix(sudo:session): session closed for user root
Nov 28 17:17:19 compute-0 sudo[139634]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rierowmbrpmuiyogukwbqdpsmmexpftz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350238.7409816-802-193240197320601/AnsiballZ_systemd.py'
Nov 28 17:17:19 compute-0 sudo[139634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:17:19 compute-0 python3.9[139636]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 17:17:19 compute-0 sudo[139634]: pam_unix(sudo:session): session closed for user root
Nov 28 17:17:20 compute-0 sudo[139789]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cuuuuckedacgbhrfmwbazmanomvgeveg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350239.9169955-802-149687000923006/AnsiballZ_systemd.py'
Nov 28 17:17:20 compute-0 sudo[139789]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:17:20 compute-0 python3.9[139791]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 17:17:20 compute-0 sudo[139789]: pam_unix(sudo:session): session closed for user root
Nov 28 17:17:21 compute-0 sudo[139944]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xvyzqejeqqxfkulyzvuyytleqztkzjmx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350240.9643207-802-13267867847568/AnsiballZ_systemd.py'
Nov 28 17:17:21 compute-0 sudo[139944]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:17:21 compute-0 python3.9[139946]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 17:17:21 compute-0 sudo[139944]: pam_unix(sudo:session): session closed for user root
Nov 28 17:17:22 compute-0 sudo[140099]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yfvlflrpdjkzubftvwouchbouuqgsfej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350241.781715-802-212760581946206/AnsiballZ_systemd.py'
Nov 28 17:17:22 compute-0 sudo[140099]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:17:22 compute-0 python3.9[140101]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 17:17:22 compute-0 sudo[140099]: pam_unix(sudo:session): session closed for user root
Nov 28 17:17:22 compute-0 sudo[140254]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-czoafeicjijhypitzxctgbhaqmhosqfb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350242.5979953-802-181217041338817/AnsiballZ_systemd.py'
Nov 28 17:17:22 compute-0 sudo[140254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:17:23 compute-0 python3.9[140256]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 17:17:23 compute-0 sudo[140254]: pam_unix(sudo:session): session closed for user root
Nov 28 17:17:23 compute-0 sudo[140409]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qmyxkckfxjbgujtufemwumpmhnojsdzh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350243.378229-802-248158585569251/AnsiballZ_systemd.py'
Nov 28 17:17:23 compute-0 sudo[140409]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:17:24 compute-0 python3.9[140411]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 17:17:24 compute-0 sudo[140409]: pam_unix(sudo:session): session closed for user root
Nov 28 17:17:24 compute-0 sudo[140564]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbgolqbaakrrjxnamoyjjxvfihvgebag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350244.4410522-802-44595179829657/AnsiballZ_systemd.py'
Nov 28 17:17:24 compute-0 sudo[140564]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:17:25 compute-0 python3.9[140566]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 17:17:25 compute-0 sudo[140564]: pam_unix(sudo:session): session closed for user root
Nov 28 17:17:25 compute-0 sudo[140719]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvbrcdmojxloogiovmiwnpypklebembg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350245.2957506-802-276291007656303/AnsiballZ_systemd.py'
Nov 28 17:17:25 compute-0 sudo[140719]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:17:26 compute-0 python3.9[140721]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 17:17:26 compute-0 sudo[140719]: pam_unix(sudo:session): session closed for user root
Nov 28 17:17:26 compute-0 sudo[140874]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oagtqyxvttcgtfgrzdrdustwcsrizfra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350246.2964108-802-111946684216777/AnsiballZ_systemd.py'
Nov 28 17:17:26 compute-0 sudo[140874]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:17:26 compute-0 python3.9[140876]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 17:17:26 compute-0 sudo[140874]: pam_unix(sudo:session): session closed for user root
Nov 28 17:17:27 compute-0 sudo[141029]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fdnldvfxaeychsjwknsueokzcfbcyzuq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350247.1342173-802-85737704502084/AnsiballZ_systemd.py'
Nov 28 17:17:27 compute-0 sudo[141029]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:17:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:17:27.657 104433 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:17:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:17:27.659 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:17:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:17:27.659 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:17:27 compute-0 python3.9[141031]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 17:17:27 compute-0 sudo[141029]: pam_unix(sudo:session): session closed for user root
Nov 28 17:17:28 compute-0 sudo[141184]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eipyaqtcrodktybydbxzzhauykrudvft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350248.1487565-802-29578584862981/AnsiballZ_systemd.py'
Nov 28 17:17:28 compute-0 sudo[141184]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:17:28 compute-0 python3.9[141186]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 17:17:28 compute-0 sudo[141184]: pam_unix(sudo:session): session closed for user root
Nov 28 17:17:29 compute-0 sudo[141339]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-colabfbdivxhiqvtuguwzaoeugcsgnwh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350248.9912817-802-228704832959182/AnsiballZ_systemd.py'
Nov 28 17:17:29 compute-0 sudo[141339]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:17:29 compute-0 python3.9[141341]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 17:17:30 compute-0 sudo[141339]: pam_unix(sudo:session): session closed for user root
Nov 28 17:17:30 compute-0 podman[141344]: 2025-11-28 17:17:30.787708249 +0000 UTC m=+0.062975660 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 28 17:17:31 compute-0 sudo[141514]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vanmyeluvulkbpshctggcbhppzwlamqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350250.9023504-802-73616398751646/AnsiballZ_systemd.py'
Nov 28 17:17:31 compute-0 sudo[141514]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:17:31 compute-0 python3.9[141516]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 17:17:31 compute-0 sudo[141514]: pam_unix(sudo:session): session closed for user root
Nov 28 17:17:32 compute-0 sudo[141669]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-krsfpptmkgijvvzcotrwuvqyzqrdtbml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350252.2623773-1006-173382674051311/AnsiballZ_file.py'
Nov 28 17:17:32 compute-0 sudo[141669]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:17:32 compute-0 python3.9[141671]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:17:32 compute-0 sudo[141669]: pam_unix(sudo:session): session closed for user root
Nov 28 17:17:33 compute-0 sudo[141821]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-osbdqdfrkajlwbtnqioygkhtsdfrsyls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350252.8697653-1006-35927765141335/AnsiballZ_file.py'
Nov 28 17:17:33 compute-0 sudo[141821]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:17:33 compute-0 python3.9[141823]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:17:33 compute-0 sudo[141821]: pam_unix(sudo:session): session closed for user root
Nov 28 17:17:33 compute-0 sudo[141990]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ihpqlvkifvtoblgryqatptzuiubcigat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350253.5420132-1006-55191146639970/AnsiballZ_file.py'
Nov 28 17:17:33 compute-0 sudo[141990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:17:33 compute-0 podman[141947]: 2025-11-28 17:17:33.901579658 +0000 UTC m=+0.088330908 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3)
Nov 28 17:17:34 compute-0 python3.9[141996]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:17:34 compute-0 sudo[141990]: pam_unix(sudo:session): session closed for user root
Nov 28 17:17:34 compute-0 sudo[142152]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izwjzhbgyfmbxrwmsdeltkqvtohrqcpp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350254.2095752-1006-57753635735756/AnsiballZ_file.py'
Nov 28 17:17:34 compute-0 sudo[142152]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:17:34 compute-0 python3.9[142154]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:17:34 compute-0 sudo[142152]: pam_unix(sudo:session): session closed for user root
Nov 28 17:17:35 compute-0 sudo[142304]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-boezjiyriafitunbejuvouavzedwmocm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350254.7892122-1006-41832948960109/AnsiballZ_file.py'
Nov 28 17:17:35 compute-0 sudo[142304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:17:35 compute-0 python3.9[142306]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:17:35 compute-0 sudo[142304]: pam_unix(sudo:session): session closed for user root
Nov 28 17:17:35 compute-0 sudo[142456]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xzbzaglrvcknvpycbtoxqpcnmpehtngj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350255.5753806-1006-1267238044062/AnsiballZ_file.py'
Nov 28 17:17:35 compute-0 sudo[142456]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:17:36 compute-0 python3.9[142458]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:17:36 compute-0 sudo[142456]: pam_unix(sudo:session): session closed for user root
Nov 28 17:17:37 compute-0 sudo[142608]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tuhsutdkzzxozeohdzwjyvifrbtiljmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350256.5942125-1092-38119152125915/AnsiballZ_stat.py'
Nov 28 17:17:37 compute-0 sudo[142608]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:17:37 compute-0 python3.9[142610]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:17:37 compute-0 sudo[142608]: pam_unix(sudo:session): session closed for user root
Nov 28 17:17:37 compute-0 sudo[142733]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lpaicljouktgjorapnlgltgifaooghiu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350256.5942125-1092-38119152125915/AnsiballZ_copy.py'
Nov 28 17:17:37 compute-0 sudo[142733]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:17:38 compute-0 python3.9[142735]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764350256.5942125-1092-38119152125915/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:17:38 compute-0 sudo[142733]: pam_unix(sudo:session): session closed for user root
Nov 28 17:17:38 compute-0 sudo[142885]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-olrxbkukeaenmsnxmrsnhdyxzpxmegaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350258.2404482-1092-238950200657558/AnsiballZ_stat.py'
Nov 28 17:17:38 compute-0 sudo[142885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:17:38 compute-0 python3.9[142887]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:17:38 compute-0 sudo[142885]: pam_unix(sudo:session): session closed for user root
Nov 28 17:17:39 compute-0 sudo[143010]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ndjgvewgyxpxvoqydtfwvvjjcegvvnsx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350258.2404482-1092-238950200657558/AnsiballZ_copy.py'
Nov 28 17:17:39 compute-0 sudo[143010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:17:39 compute-0 python3.9[143012]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764350258.2404482-1092-238950200657558/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:17:39 compute-0 sudo[143010]: pam_unix(sudo:session): session closed for user root
Nov 28 17:17:40 compute-0 sudo[143162]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ujauwqwidyquzfablyymmfonqlerllqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350259.7967365-1092-236204908477835/AnsiballZ_stat.py'
Nov 28 17:17:40 compute-0 sudo[143162]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:17:40 compute-0 python3.9[143164]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:17:40 compute-0 sudo[143162]: pam_unix(sudo:session): session closed for user root
Nov 28 17:17:40 compute-0 sudo[143287]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqdugdpregqjzvlronurrqbjdnfnhzhx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350259.7967365-1092-236204908477835/AnsiballZ_copy.py'
Nov 28 17:17:40 compute-0 sudo[143287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:17:40 compute-0 python3.9[143289]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764350259.7967365-1092-236204908477835/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:17:40 compute-0 sudo[143287]: pam_unix(sudo:session): session closed for user root
Nov 28 17:17:41 compute-0 sudo[143439]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eukmssgmctrgczsiwhmeufafofbhoeyi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350261.0257814-1092-169532954235353/AnsiballZ_stat.py'
Nov 28 17:17:41 compute-0 sudo[143439]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:17:41 compute-0 python3.9[143441]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:17:41 compute-0 sudo[143439]: pam_unix(sudo:session): session closed for user root
Nov 28 17:17:41 compute-0 sudo[143564]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxasoxifhdiherxlqilqhtapprgdponc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350261.0257814-1092-169532954235353/AnsiballZ_copy.py'
Nov 28 17:17:41 compute-0 sudo[143564]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:17:42 compute-0 python3.9[143566]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764350261.0257814-1092-169532954235353/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:17:42 compute-0 sudo[143564]: pam_unix(sudo:session): session closed for user root
Nov 28 17:17:42 compute-0 sudo[143716]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uqbzshpuljbnvjeicftjnkdvbnjacxkg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350262.3064954-1092-44198837148855/AnsiballZ_stat.py'
Nov 28 17:17:42 compute-0 sudo[143716]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:17:42 compute-0 python3.9[143718]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:17:42 compute-0 sudo[143716]: pam_unix(sudo:session): session closed for user root
Nov 28 17:17:43 compute-0 sudo[143841]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-klldvggqxeahurktczhyislneojcvlyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350262.3064954-1092-44198837148855/AnsiballZ_copy.py'
Nov 28 17:17:43 compute-0 sudo[143841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:17:43 compute-0 python3.9[143843]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764350262.3064954-1092-44198837148855/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:17:43 compute-0 sudo[143841]: pam_unix(sudo:session): session closed for user root
Nov 28 17:17:43 compute-0 sudo[143993]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udxcpjgusvdvhumzsofipyjipsxabflo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350263.675844-1092-7974374771388/AnsiballZ_stat.py'
Nov 28 17:17:43 compute-0 sudo[143993]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:17:44 compute-0 python3.9[143995]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:17:44 compute-0 sudo[143993]: pam_unix(sudo:session): session closed for user root
Nov 28 17:17:44 compute-0 sudo[144118]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abxkelidepjikbujmccbzsdymsnenssi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350263.675844-1092-7974374771388/AnsiballZ_copy.py'
Nov 28 17:17:44 compute-0 sudo[144118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:17:44 compute-0 python3.9[144120]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764350263.675844-1092-7974374771388/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:17:44 compute-0 sudo[144118]: pam_unix(sudo:session): session closed for user root
Nov 28 17:17:45 compute-0 sudo[144270]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwuollnaamdpjbovblzqrkvkpkaggnym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350264.8801713-1092-86047627576398/AnsiballZ_stat.py'
Nov 28 17:17:45 compute-0 sudo[144270]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:17:45 compute-0 python3.9[144272]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:17:45 compute-0 sudo[144270]: pam_unix(sudo:session): session closed for user root
Nov 28 17:17:45 compute-0 sudo[144393]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qvfkaoakhravzbgzrsoxadmmjielculc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350264.8801713-1092-86047627576398/AnsiballZ_copy.py'
Nov 28 17:17:45 compute-0 sudo[144393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:17:45 compute-0 python3.9[144395]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764350264.8801713-1092-86047627576398/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:17:45 compute-0 sudo[144393]: pam_unix(sudo:session): session closed for user root
Nov 28 17:17:46 compute-0 sudo[144545]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uykbbbatmnddnqbhafmtekzczmkjmqtx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350266.0922394-1092-22537437282220/AnsiballZ_stat.py'
Nov 28 17:17:46 compute-0 sudo[144545]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:17:46 compute-0 python3.9[144547]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:17:46 compute-0 sudo[144545]: pam_unix(sudo:session): session closed for user root
Nov 28 17:17:46 compute-0 sudo[144670]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hydukyofpgbibnrfnrkgqvsnkmykwlhr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350266.0922394-1092-22537437282220/AnsiballZ_copy.py'
Nov 28 17:17:46 compute-0 sudo[144670]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:17:47 compute-0 python3.9[144672]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764350266.0922394-1092-22537437282220/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:17:47 compute-0 sudo[144670]: pam_unix(sudo:session): session closed for user root
Nov 28 17:17:48 compute-0 sudo[144822]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ookogmuhorbzaaovixcqxxkpgbihbbcu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350268.2137415-1318-61151738640002/AnsiballZ_command.py'
Nov 28 17:17:48 compute-0 sudo[144822]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:17:48 compute-0 python3.9[144824]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Nov 28 17:17:48 compute-0 sudo[144822]: pam_unix(sudo:session): session closed for user root
Nov 28 17:17:49 compute-0 sudo[144975]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttlhwakqabdgdcixsnysxujwwfikxosl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350269.0980978-1336-171907719566455/AnsiballZ_file.py'
Nov 28 17:17:49 compute-0 sudo[144975]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:17:49 compute-0 python3.9[144977]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:17:49 compute-0 sudo[144975]: pam_unix(sudo:session): session closed for user root
Nov 28 17:17:50 compute-0 sudo[145127]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vcmlflokfrdxeasvwvhbhnuahrmxyxgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350269.7433655-1336-247233032291376/AnsiballZ_file.py'
Nov 28 17:17:50 compute-0 sudo[145127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:17:50 compute-0 python3.9[145129]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:17:50 compute-0 sudo[145127]: pam_unix(sudo:session): session closed for user root
Nov 28 17:17:50 compute-0 sudo[145279]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owszfufswdjegpwnmdphrriadnmjkkcr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350270.3420255-1336-59537086155905/AnsiballZ_file.py'
Nov 28 17:17:50 compute-0 sudo[145279]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:17:50 compute-0 python3.9[145281]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:17:50 compute-0 sudo[145279]: pam_unix(sudo:session): session closed for user root
Nov 28 17:17:51 compute-0 sudo[145431]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vcqhnixxtwwilpxkwszhzhrbiuoqmbjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350271.0173273-1336-32872802790976/AnsiballZ_file.py'
Nov 28 17:17:51 compute-0 sudo[145431]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:17:51 compute-0 python3.9[145433]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:17:51 compute-0 sudo[145431]: pam_unix(sudo:session): session closed for user root
Nov 28 17:17:51 compute-0 sudo[145583]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxgwovyxurusgehfmamcfzxaiytvufxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350271.6865854-1336-220089299768400/AnsiballZ_file.py'
Nov 28 17:17:51 compute-0 sudo[145583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:17:52 compute-0 python3.9[145585]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:17:52 compute-0 sudo[145583]: pam_unix(sudo:session): session closed for user root
Nov 28 17:17:52 compute-0 sudo[145735]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtwtcziauaefdrdmbflnezulwqgjkgpf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350272.3124366-1336-268614333180855/AnsiballZ_file.py'
Nov 28 17:17:52 compute-0 sudo[145735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:17:52 compute-0 python3.9[145737]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:17:52 compute-0 sudo[145735]: pam_unix(sudo:session): session closed for user root
Nov 28 17:17:53 compute-0 sudo[145887]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-omymyghhkdszaelfepozguxcdtrmtvlp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350272.9295368-1336-180912788475126/AnsiballZ_file.py'
Nov 28 17:17:53 compute-0 sudo[145887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:17:53 compute-0 python3.9[145889]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:17:53 compute-0 sudo[145887]: pam_unix(sudo:session): session closed for user root
Nov 28 17:17:53 compute-0 sudo[146039]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gcgcnvtemflvmxzjoznohcbmbszfmote ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350273.5324352-1336-229188944083816/AnsiballZ_file.py'
Nov 28 17:17:53 compute-0 sudo[146039]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:17:53 compute-0 python3.9[146041]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:17:54 compute-0 sudo[146039]: pam_unix(sudo:session): session closed for user root
Nov 28 17:17:54 compute-0 sudo[146191]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mgskvltbkpthyncxjwbripvegrxfvnql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350274.1398253-1336-64831654063949/AnsiballZ_file.py'
Nov 28 17:17:54 compute-0 sudo[146191]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:17:54 compute-0 python3.9[146193]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:17:54 compute-0 sudo[146191]: pam_unix(sudo:session): session closed for user root
Nov 28 17:17:54 compute-0 sudo[146343]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajndbohlhfjgsqdanyaoxdjysoxqqfga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350274.7183743-1336-235827742081314/AnsiballZ_file.py'
Nov 28 17:17:54 compute-0 sudo[146343]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:17:55 compute-0 python3.9[146345]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:17:55 compute-0 sudo[146343]: pam_unix(sudo:session): session closed for user root
Nov 28 17:17:55 compute-0 sudo[146495]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzydlduggkxajbuajaarvnqtjgguxrdt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350275.3235798-1336-18229309019702/AnsiballZ_file.py'
Nov 28 17:17:55 compute-0 sudo[146495]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:17:55 compute-0 python3.9[146497]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:17:55 compute-0 sudo[146495]: pam_unix(sudo:session): session closed for user root
Nov 28 17:17:56 compute-0 sudo[146647]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zqqsrufjtyztlggavrcfsvdxdzutwgut ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350275.9630268-1336-69556099863295/AnsiballZ_file.py'
Nov 28 17:17:56 compute-0 sudo[146647]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:17:56 compute-0 python3.9[146649]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:17:56 compute-0 sudo[146647]: pam_unix(sudo:session): session closed for user root
Nov 28 17:17:56 compute-0 sudo[146799]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ozoentwmutanaromcxveohjiydnugtvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350276.6393008-1336-197458724926963/AnsiballZ_file.py'
Nov 28 17:17:56 compute-0 sudo[146799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:17:57 compute-0 python3.9[146801]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:17:57 compute-0 sudo[146799]: pam_unix(sudo:session): session closed for user root
Nov 28 17:17:57 compute-0 sudo[146951]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xozepvamdudoypwuujqvkihxksrjmxpg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350277.265343-1336-96070584647082/AnsiballZ_file.py'
Nov 28 17:17:57 compute-0 sudo[146951]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:17:57 compute-0 python3.9[146953]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:17:57 compute-0 sudo[146951]: pam_unix(sudo:session): session closed for user root
Nov 28 17:17:58 compute-0 sudo[147103]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qpjwdotiljfladaljpwqkiidftjhzdfb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350278.5999806-1534-250248165987214/AnsiballZ_stat.py'
Nov 28 17:17:58 compute-0 sudo[147103]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:17:59 compute-0 python3.9[147105]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:17:59 compute-0 sudo[147103]: pam_unix(sudo:session): session closed for user root
Nov 28 17:17:59 compute-0 sudo[147226]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwtuekjjtpthlhyjquwbujmmcnazohss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350278.5999806-1534-250248165987214/AnsiballZ_copy.py'
Nov 28 17:17:59 compute-0 sudo[147226]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:17:59 compute-0 python3.9[147228]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764350278.5999806-1534-250248165987214/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:17:59 compute-0 sudo[147226]: pam_unix(sudo:session): session closed for user root
Nov 28 17:18:00 compute-0 sudo[147378]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-otrolprvulijsdtyfxbqzgtedqcelmcd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350279.8214023-1534-16703834735077/AnsiballZ_stat.py'
Nov 28 17:18:00 compute-0 sudo[147378]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:18:00 compute-0 python3.9[147380]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:18:00 compute-0 sudo[147378]: pam_unix(sudo:session): session closed for user root
Nov 28 17:18:00 compute-0 sudo[147501]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-irengmegemmakdaifknqdqgugvgbuipz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350279.8214023-1534-16703834735077/AnsiballZ_copy.py'
Nov 28 17:18:00 compute-0 sudo[147501]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:18:00 compute-0 python3.9[147503]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764350279.8214023-1534-16703834735077/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:18:00 compute-0 sudo[147501]: pam_unix(sudo:session): session closed for user root
Nov 28 17:18:01 compute-0 podman[147603]: 2025-11-28 17:18:01.198528776 +0000 UTC m=+0.049955886 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 17:18:01 compute-0 sudo[147672]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxwuflwepeckajqsqbvbaryscvzyrhpi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350280.9540806-1534-180578852696001/AnsiballZ_stat.py'
Nov 28 17:18:01 compute-0 sudo[147672]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:18:01 compute-0 python3.9[147674]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:18:01 compute-0 sudo[147672]: pam_unix(sudo:session): session closed for user root
Nov 28 17:18:01 compute-0 sudo[147795]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-guovikiltcetdpmiiodcwshngxnviboi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350280.9540806-1534-180578852696001/AnsiballZ_copy.py'
Nov 28 17:18:01 compute-0 sudo[147795]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:18:01 compute-0 python3.9[147797]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764350280.9540806-1534-180578852696001/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:18:01 compute-0 sudo[147795]: pam_unix(sudo:session): session closed for user root
Nov 28 17:18:02 compute-0 sudo[147947]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lermokpirsiccybjcxdsbxgckdbtdpuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350282.123873-1534-114636598418041/AnsiballZ_stat.py'
Nov 28 17:18:02 compute-0 sudo[147947]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:18:02 compute-0 python3.9[147949]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:18:02 compute-0 sudo[147947]: pam_unix(sudo:session): session closed for user root
Nov 28 17:18:02 compute-0 sudo[148070]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-henzexplexhwityjynblhusrahmvuxxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350282.123873-1534-114636598418041/AnsiballZ_copy.py'
Nov 28 17:18:02 compute-0 sudo[148070]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:18:03 compute-0 python3.9[148072]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764350282.123873-1534-114636598418041/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:18:03 compute-0 sudo[148070]: pam_unix(sudo:session): session closed for user root
Nov 28 17:18:03 compute-0 sudo[148222]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzuiqwacewrtxviiovgsriyqpskhfhid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350283.30864-1534-15005246032351/AnsiballZ_stat.py'
Nov 28 17:18:03 compute-0 sudo[148222]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:18:03 compute-0 python3.9[148224]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:18:03 compute-0 sudo[148222]: pam_unix(sudo:session): session closed for user root
Nov 28 17:18:04 compute-0 sudo[148361]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnnsbxumohemwhgjdegprudjtaioyvaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350283.30864-1534-15005246032351/AnsiballZ_copy.py'
Nov 28 17:18:04 compute-0 sudo[148361]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:18:04 compute-0 podman[148319]: 2025-11-28 17:18:04.231148959 +0000 UTC m=+0.092431419 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 17:18:04 compute-0 python3.9[148367]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764350283.30864-1534-15005246032351/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:18:04 compute-0 sudo[148361]: pam_unix(sudo:session): session closed for user root
Nov 28 17:18:04 compute-0 sudo[148520]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ncdbpvhauixjmccupjcaucdbqomfswhx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350284.5580974-1534-276766099223284/AnsiballZ_stat.py'
Nov 28 17:18:04 compute-0 sudo[148520]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:18:05 compute-0 python3.9[148522]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:18:05 compute-0 sudo[148520]: pam_unix(sudo:session): session closed for user root
Nov 28 17:18:05 compute-0 sudo[148643]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ybjgiyivaivrdtljswjxtvoqlyllgrtu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350284.5580974-1534-276766099223284/AnsiballZ_copy.py'
Nov 28 17:18:05 compute-0 sudo[148643]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:18:05 compute-0 python3.9[148645]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764350284.5580974-1534-276766099223284/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:18:05 compute-0 sudo[148643]: pam_unix(sudo:session): session closed for user root
Nov 28 17:18:05 compute-0 sudo[148795]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdgrdhgxosldeuorvozmglxarhqbptng ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350285.6830776-1534-273822568312081/AnsiballZ_stat.py'
Nov 28 17:18:05 compute-0 sudo[148795]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:18:06 compute-0 python3.9[148797]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:18:06 compute-0 sudo[148795]: pam_unix(sudo:session): session closed for user root
Nov 28 17:18:06 compute-0 sudo[148918]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vluydgtbyysahtjkempepmxumdustgfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350285.6830776-1534-273822568312081/AnsiballZ_copy.py'
Nov 28 17:18:06 compute-0 sudo[148918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:18:07 compute-0 python3.9[148920]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764350285.6830776-1534-273822568312081/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:18:07 compute-0 sudo[148918]: pam_unix(sudo:session): session closed for user root
Nov 28 17:18:07 compute-0 sudo[149070]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajigaxxewirzdnzelmxdpqzhgvahzjpd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350287.2843256-1534-165379202842348/AnsiballZ_stat.py'
Nov 28 17:18:07 compute-0 sudo[149070]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:18:07 compute-0 python3.9[149072]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:18:07 compute-0 sudo[149070]: pam_unix(sudo:session): session closed for user root
Nov 28 17:18:08 compute-0 sudo[149193]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ikbidnvmffxtlrmqmugdceyfeqnungyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350287.2843256-1534-165379202842348/AnsiballZ_copy.py'
Nov 28 17:18:08 compute-0 sudo[149193]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:18:08 compute-0 python3.9[149195]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764350287.2843256-1534-165379202842348/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:18:08 compute-0 sudo[149193]: pam_unix(sudo:session): session closed for user root
Nov 28 17:18:09 compute-0 sudo[149345]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sgpiheqhsxusvsbpijtobbdtaifobixp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350289.0186505-1534-83377947960376/AnsiballZ_stat.py'
Nov 28 17:18:09 compute-0 sudo[149345]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:18:09 compute-0 python3.9[149347]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:18:09 compute-0 sudo[149345]: pam_unix(sudo:session): session closed for user root
Nov 28 17:18:09 compute-0 sudo[149468]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxdbjuymfjpurckryqnzndbtlirhcjdl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350289.0186505-1534-83377947960376/AnsiballZ_copy.py'
Nov 28 17:18:09 compute-0 sudo[149468]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:18:10 compute-0 python3.9[149470]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764350289.0186505-1534-83377947960376/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:18:10 compute-0 sudo[149468]: pam_unix(sudo:session): session closed for user root
Nov 28 17:18:10 compute-0 sudo[149620]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ugobedyoicuaexafmtmaqzbnmgghvcla ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350290.2492783-1534-160826520346981/AnsiballZ_stat.py'
Nov 28 17:18:10 compute-0 sudo[149620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:18:10 compute-0 python3.9[149622]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:18:10 compute-0 sudo[149620]: pam_unix(sudo:session): session closed for user root
Nov 28 17:18:11 compute-0 sudo[149743]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibfnjxfarnsjtgtobcvqvvovedurbpyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350290.2492783-1534-160826520346981/AnsiballZ_copy.py'
Nov 28 17:18:11 compute-0 sudo[149743]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:18:11 compute-0 python3.9[149745]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764350290.2492783-1534-160826520346981/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:18:11 compute-0 sudo[149743]: pam_unix(sudo:session): session closed for user root
Nov 28 17:18:11 compute-0 sudo[149895]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yaieswynjkdopanmuhaaeosxhtmkmblu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350291.5287588-1534-187429280767391/AnsiballZ_stat.py'
Nov 28 17:18:11 compute-0 sudo[149895]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:18:11 compute-0 python3.9[149897]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:18:11 compute-0 sudo[149895]: pam_unix(sudo:session): session closed for user root
Nov 28 17:18:12 compute-0 sudo[150018]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ypdtgvklvnnlaityizmdxtmqbmdqvgxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350291.5287588-1534-187429280767391/AnsiballZ_copy.py'
Nov 28 17:18:12 compute-0 sudo[150018]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:18:12 compute-0 python3.9[150020]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764350291.5287588-1534-187429280767391/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:18:12 compute-0 sudo[150018]: pam_unix(sudo:session): session closed for user root
Nov 28 17:18:13 compute-0 sudo[150170]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pwbwkdsxzpwyubdqjtyufazvuizjtncy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350292.723089-1534-116225442348873/AnsiballZ_stat.py'
Nov 28 17:18:13 compute-0 sudo[150170]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:18:13 compute-0 python3.9[150172]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:18:13 compute-0 sudo[150170]: pam_unix(sudo:session): session closed for user root
Nov 28 17:18:13 compute-0 sudo[150293]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hdpdkmymvokhfmktyshuvnbobashorpr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350292.723089-1534-116225442348873/AnsiballZ_copy.py'
Nov 28 17:18:13 compute-0 sudo[150293]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:18:13 compute-0 python3.9[150295]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764350292.723089-1534-116225442348873/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:18:13 compute-0 sudo[150293]: pam_unix(sudo:session): session closed for user root
Nov 28 17:18:14 compute-0 sudo[150445]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwosaoisnfhzvznwrqgjwtgdhftcihbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350293.909574-1534-194314500221325/AnsiballZ_stat.py'
Nov 28 17:18:14 compute-0 sudo[150445]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:18:14 compute-0 python3.9[150447]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:18:14 compute-0 sudo[150445]: pam_unix(sudo:session): session closed for user root
Nov 28 17:18:14 compute-0 sudo[150568]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chkjsmtctkubrcvbuhrkwvwnddfjmrmh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350293.909574-1534-194314500221325/AnsiballZ_copy.py'
Nov 28 17:18:14 compute-0 sudo[150568]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:18:14 compute-0 python3.9[150570]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764350293.909574-1534-194314500221325/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:18:14 compute-0 sudo[150568]: pam_unix(sudo:session): session closed for user root
Nov 28 17:18:15 compute-0 sudo[150720]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-prlvcglayjpqkosikfiurxtahcssufov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350295.0895193-1534-92277211134549/AnsiballZ_stat.py'
Nov 28 17:18:15 compute-0 sudo[150720]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:18:15 compute-0 python3.9[150722]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:18:15 compute-0 sudo[150720]: pam_unix(sudo:session): session closed for user root
Nov 28 17:18:15 compute-0 sudo[150843]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zjanptzxetaxjtqnnecbpxmcjynjztas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350295.0895193-1534-92277211134549/AnsiballZ_copy.py'
Nov 28 17:18:15 compute-0 sudo[150843]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:18:16 compute-0 python3.9[150845]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764350295.0895193-1534-92277211134549/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:18:16 compute-0 sudo[150843]: pam_unix(sudo:session): session closed for user root
Nov 28 17:18:17 compute-0 python3.9[150995]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                             ls -lRZ /run/libvirt | grep -E ':container_\S+_t'
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 17:18:18 compute-0 sudo[151148]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gqzpplmwdomqocpgshoyxuhnvyklfirp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350298.1104515-1946-222196005662192/AnsiballZ_seboolean.py'
Nov 28 17:18:18 compute-0 sudo[151148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:18:18 compute-0 python3.9[151150]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Nov 28 17:18:20 compute-0 sudo[151148]: pam_unix(sudo:session): session closed for user root
Nov 28 17:18:20 compute-0 sudo[151304]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjftghwdrqrosfyrleobxrtdxyzjinal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350300.5598943-1962-136376323250248/AnsiballZ_copy.py'
Nov 28 17:18:20 compute-0 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=15 res=1
Nov 28 17:18:20 compute-0 sudo[151304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:18:21 compute-0 python3.9[151306]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:18:21 compute-0 sudo[151304]: pam_unix(sudo:session): session closed for user root
Nov 28 17:18:21 compute-0 sudo[151456]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mkijzryblofshnuwsjaacdnvgqpiqooj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350301.167491-1962-176141100320535/AnsiballZ_copy.py'
Nov 28 17:18:21 compute-0 sudo[151456]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:18:21 compute-0 python3.9[151458]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:18:21 compute-0 sudo[151456]: pam_unix(sudo:session): session closed for user root
Nov 28 17:18:22 compute-0 sudo[151608]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-szruykkxcgdehsuuyjdciczrthfcvmge ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350301.8281617-1962-268715743334266/AnsiballZ_copy.py'
Nov 28 17:18:22 compute-0 sudo[151608]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:18:22 compute-0 python3.9[151610]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:18:22 compute-0 sudo[151608]: pam_unix(sudo:session): session closed for user root
Nov 28 17:18:22 compute-0 sudo[151760]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzlfjyimfugntglajrbmprljbhgvndfe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350302.4636397-1962-246699626411332/AnsiballZ_copy.py'
Nov 28 17:18:22 compute-0 sudo[151760]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:18:22 compute-0 python3.9[151762]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:18:22 compute-0 sudo[151760]: pam_unix(sudo:session): session closed for user root
Nov 28 17:18:23 compute-0 sudo[151912]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jcxrgyhgrsgnnuaozkyjxgktecpwfgai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350303.117324-1962-179248921811423/AnsiballZ_copy.py'
Nov 28 17:18:23 compute-0 sudo[151912]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:18:23 compute-0 python3.9[151914]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:18:23 compute-0 sudo[151912]: pam_unix(sudo:session): session closed for user root
Nov 28 17:18:24 compute-0 sudo[152064]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-msuglxgspemitmjvdtzazbxttsobznuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350304.0365176-2034-232669879430265/AnsiballZ_copy.py'
Nov 28 17:18:24 compute-0 sudo[152064]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:18:24 compute-0 python3.9[152066]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:18:24 compute-0 sudo[152064]: pam_unix(sudo:session): session closed for user root
Nov 28 17:18:25 compute-0 sudo[152216]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tivbridolhrtekabjlldmmurmlvppvyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350304.7142513-2034-263116962116130/AnsiballZ_copy.py'
Nov 28 17:18:25 compute-0 sudo[152216]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:18:25 compute-0 python3.9[152218]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:18:25 compute-0 sudo[152216]: pam_unix(sudo:session): session closed for user root
Nov 28 17:18:26 compute-0 sudo[152368]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qreanjdzrywyurhtqxilvffqgvmvwlpi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350305.3653147-2034-101655281345547/AnsiballZ_copy.py'
Nov 28 17:18:26 compute-0 sudo[152368]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:18:26 compute-0 python3.9[152370]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:18:26 compute-0 sudo[152368]: pam_unix(sudo:session): session closed for user root
Nov 28 17:18:26 compute-0 sudo[152520]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykavponlkidrtejegoeraoecujzqiqaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350306.3990362-2034-72557586674131/AnsiballZ_copy.py'
Nov 28 17:18:26 compute-0 sudo[152520]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:18:26 compute-0 python3.9[152522]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:18:26 compute-0 sudo[152520]: pam_unix(sudo:session): session closed for user root
Nov 28 17:18:27 compute-0 sudo[152672]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qnizjwkdwigmoclkjjwhehfwcrcdwrgn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350307.0702033-2034-104627402568226/AnsiballZ_copy.py'
Nov 28 17:18:27 compute-0 sudo[152672]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:18:27 compute-0 python3.9[152674]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:18:27 compute-0 sudo[152672]: pam_unix(sudo:session): session closed for user root
Nov 28 17:18:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:18:27.659 104433 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:18:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:18:27.661 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:18:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:18:27.661 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:18:28 compute-0 sudo[152824]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lysrufjvhtctapcvctificisilxnkvwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350307.8899677-2106-198092055124999/AnsiballZ_systemd.py'
Nov 28 17:18:28 compute-0 sudo[152824]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:18:28 compute-0 python3.9[152826]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 17:18:28 compute-0 systemd[1]: Reloading.
Nov 28 17:18:28 compute-0 systemd-rc-local-generator[152854]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 17:18:28 compute-0 systemd-sysv-generator[152858]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 17:18:28 compute-0 systemd[1]: Starting libvirt logging daemon socket...
Nov 28 17:18:28 compute-0 systemd[1]: Listening on libvirt logging daemon socket.
Nov 28 17:18:28 compute-0 systemd[1]: Starting libvirt logging daemon admin socket...
Nov 28 17:18:28 compute-0 systemd[1]: Listening on libvirt logging daemon admin socket.
Nov 28 17:18:28 compute-0 systemd[1]: Starting libvirt logging daemon...
Nov 28 17:18:28 compute-0 systemd[1]: Started libvirt logging daemon.
Nov 28 17:18:28 compute-0 sudo[152824]: pam_unix(sudo:session): session closed for user root
Nov 28 17:18:30 compute-0 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Nov 28 17:18:30 compute-0 sudo[153018]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uhfuruptunucfypuyxsqffpvzqedbocc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350309.7886033-2106-92358547289880/AnsiballZ_systemd.py'
Nov 28 17:18:30 compute-0 sudo[153018]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:18:30 compute-0 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Nov 28 17:18:30 compute-0 python3.9[153020]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 17:18:30 compute-0 systemd[1]: Reloading.
Nov 28 17:18:30 compute-0 systemd-rc-local-generator[153049]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 17:18:30 compute-0 systemd-sysv-generator[153052]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 17:18:30 compute-0 systemd[1]: Starting libvirt nodedev daemon socket...
Nov 28 17:18:30 compute-0 systemd[1]: Listening on libvirt nodedev daemon socket.
Nov 28 17:18:30 compute-0 systemd[1]: Starting libvirt nodedev daemon admin socket...
Nov 28 17:18:30 compute-0 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Nov 28 17:18:30 compute-0 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Nov 28 17:18:30 compute-0 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Nov 28 17:18:30 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Nov 28 17:18:30 compute-0 systemd[1]: Started libvirt nodedev daemon.
Nov 28 17:18:30 compute-0 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Nov 28 17:18:30 compute-0 sudo[153018]: pam_unix(sudo:session): session closed for user root
Nov 28 17:18:30 compute-0 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Nov 28 17:18:31 compute-0 sudo[153256]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwmfstcrzkwjuzifefgvpqmgbiwizjsi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350310.971847-2106-91306786049695/AnsiballZ_systemd.py'
Nov 28 17:18:31 compute-0 sudo[153256]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:18:31 compute-0 podman[153218]: 2025-11-28 17:18:31.3522288 +0000 UTC m=+0.057516275 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 28 17:18:31 compute-0 python3.9[153264]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 17:18:31 compute-0 systemd[1]: Reloading.
Nov 28 17:18:31 compute-0 systemd-sysv-generator[153297]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 17:18:31 compute-0 systemd-rc-local-generator[153294]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 17:18:31 compute-0 setroubleshoot[152994]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 1ffb3c62-c131-4ee9-a3ba-9e83a88b61e2
Nov 28 17:18:31 compute-0 setroubleshoot[152994]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Nov 28 17:18:32 compute-0 systemd[1]: Starting libvirt proxy daemon admin socket...
Nov 28 17:18:32 compute-0 systemd[1]: Starting libvirt proxy daemon read-only socket...
Nov 28 17:18:32 compute-0 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Nov 28 17:18:32 compute-0 systemd[1]: Listening on libvirt proxy daemon admin socket.
Nov 28 17:18:32 compute-0 systemd[1]: Starting libvirt proxy daemon...
Nov 28 17:18:32 compute-0 systemd[1]: Started libvirt proxy daemon.
Nov 28 17:18:32 compute-0 sudo[153256]: pam_unix(sudo:session): session closed for user root
Nov 28 17:18:32 compute-0 sudo[153475]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-szviqvfaizjhocgoyvwrulsjdadquats ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350312.2929678-2106-278105537432483/AnsiballZ_systemd.py'
Nov 28 17:18:32 compute-0 sudo[153475]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:18:32 compute-0 python3.9[153477]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 17:18:32 compute-0 systemd[1]: Reloading.
Nov 28 17:18:32 compute-0 systemd-rc-local-generator[153507]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 17:18:32 compute-0 systemd-sysv-generator[153511]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 17:18:33 compute-0 systemd[1]: Listening on libvirt locking daemon socket.
Nov 28 17:18:33 compute-0 systemd[1]: Starting libvirt QEMU daemon socket...
Nov 28 17:18:33 compute-0 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Nov 28 17:18:33 compute-0 systemd[1]: Starting Virtual Machine and Container Registration Service...
Nov 28 17:18:33 compute-0 systemd[1]: Listening on libvirt QEMU daemon socket.
Nov 28 17:18:33 compute-0 systemd[1]: Starting libvirt QEMU daemon admin socket...
Nov 28 17:18:33 compute-0 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Nov 28 17:18:33 compute-0 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Nov 28 17:18:33 compute-0 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Nov 28 17:18:33 compute-0 systemd[1]: Started Virtual Machine and Container Registration Service.
Nov 28 17:18:33 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Nov 28 17:18:33 compute-0 systemd[1]: Started libvirt QEMU daemon.
Nov 28 17:18:33 compute-0 sudo[153475]: pam_unix(sudo:session): session closed for user root
Nov 28 17:18:34 compute-0 sudo[153690]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-butaqtpivuosxvfzwdgpndddcteamffm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350313.8599226-2106-162466373111903/AnsiballZ_systemd.py'
Nov 28 17:18:34 compute-0 sudo[153690]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:18:34 compute-0 python3.9[153692]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 17:18:34 compute-0 systemd[1]: Reloading.
Nov 28 17:18:34 compute-0 systemd-sysv-generator[153739]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 17:18:34 compute-0 systemd-rc-local-generator[153736]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 17:18:34 compute-0 podman[153694]: 2025-11-28 17:18:34.654071862 +0000 UTC m=+0.111394251 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 17:18:34 compute-0 systemd[1]: Starting libvirt secret daemon socket...
Nov 28 17:18:34 compute-0 systemd[1]: Listening on libvirt secret daemon socket.
Nov 28 17:18:34 compute-0 systemd[1]: Starting libvirt secret daemon admin socket...
Nov 28 17:18:34 compute-0 systemd[1]: Starting libvirt secret daemon read-only socket...
Nov 28 17:18:34 compute-0 systemd[1]: Listening on libvirt secret daemon read-only socket.
Nov 28 17:18:34 compute-0 systemd[1]: Listening on libvirt secret daemon admin socket.
Nov 28 17:18:34 compute-0 systemd[1]: Starting libvirt secret daemon...
Nov 28 17:18:34 compute-0 systemd[1]: Started libvirt secret daemon.
Nov 28 17:18:34 compute-0 sudo[153690]: pam_unix(sudo:session): session closed for user root
Nov 28 17:18:35 compute-0 sudo[153926]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hcgaspblqsntzcyxlskspliqmpyjaxsx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350315.235119-2180-119072594868710/AnsiballZ_file.py'
Nov 28 17:18:35 compute-0 sudo[153926]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:18:35 compute-0 python3.9[153928]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:18:35 compute-0 sudo[153926]: pam_unix(sudo:session): session closed for user root
Nov 28 17:18:36 compute-0 sudo[154078]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvqdvagcrwyeckniuptehpfbhvxqnlgm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350316.014618-2196-211806975147053/AnsiballZ_find.py'
Nov 28 17:18:36 compute-0 sudo[154078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:18:36 compute-0 python3.9[154080]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 28 17:18:36 compute-0 sudo[154078]: pam_unix(sudo:session): session closed for user root
Nov 28 17:18:38 compute-0 sudo[154230]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pajbreroxsicqkrmjqncemmyfvajlmzc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350317.2967622-2224-10758592205051/AnsiballZ_stat.py'
Nov 28 17:18:38 compute-0 sudo[154230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:18:38 compute-0 python3.9[154232]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:18:38 compute-0 sudo[154230]: pam_unix(sudo:session): session closed for user root
Nov 28 17:18:39 compute-0 sudo[154353]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgdnwbredqyhtwrepsdueetwtdtzqrgs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350317.2967622-2224-10758592205051/AnsiballZ_copy.py'
Nov 28 17:18:39 compute-0 sudo[154353]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:18:39 compute-0 python3.9[154355]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764350317.2967622-2224-10758592205051/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:18:39 compute-0 sudo[154353]: pam_unix(sudo:session): session closed for user root
Nov 28 17:18:40 compute-0 sudo[154505]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hffeexjyjlaxfbhemkhxucvebngwzrja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350319.7456956-2256-173945634901749/AnsiballZ_file.py'
Nov 28 17:18:40 compute-0 sudo[154505]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:18:40 compute-0 python3.9[154507]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:18:40 compute-0 sudo[154505]: pam_unix(sudo:session): session closed for user root
Nov 28 17:18:41 compute-0 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Nov 28 17:18:41 compute-0 systemd[1]: setroubleshootd.service: Deactivated successfully.
Nov 28 17:18:44 compute-0 sudo[154657]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rvzyxjyktjwjrleluwucwfywlwwhquek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350320.7464418-2272-167131301749045/AnsiballZ_stat.py'
Nov 28 17:18:44 compute-0 sudo[154657]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:18:44 compute-0 python3.9[154659]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:18:44 compute-0 sudo[154657]: pam_unix(sudo:session): session closed for user root
Nov 28 17:18:45 compute-0 sudo[154735]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rffvyalvgpfdcjnwirfdvjhatndosuuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350320.7464418-2272-167131301749045/AnsiballZ_file.py'
Nov 28 17:18:45 compute-0 sudo[154735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:18:45 compute-0 python3.9[154737]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:18:45 compute-0 sudo[154735]: pam_unix(sudo:session): session closed for user root
Nov 28 17:18:46 compute-0 sudo[154887]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbdwbueuqulvlvqommqxbwhcppwaulyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350325.7390058-2296-82452408227256/AnsiballZ_stat.py'
Nov 28 17:18:46 compute-0 sudo[154887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:18:46 compute-0 python3.9[154889]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:18:46 compute-0 sudo[154887]: pam_unix(sudo:session): session closed for user root
Nov 28 17:18:46 compute-0 sudo[154965]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnlbfjymxefdtpburbodzzbdmbeahskd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350325.7390058-2296-82452408227256/AnsiballZ_file.py'
Nov 28 17:18:46 compute-0 sudo[154965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:18:46 compute-0 python3.9[154967]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.si23i6p4 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:18:46 compute-0 sudo[154965]: pam_unix(sudo:session): session closed for user root
Nov 28 17:18:47 compute-0 sudo[155117]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-npowqugmlnbhxthkkhtisujcplpvkvbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350327.6473043-2320-100054779848772/AnsiballZ_stat.py'
Nov 28 17:18:48 compute-0 sudo[155117]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:18:48 compute-0 python3.9[155119]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:18:48 compute-0 sudo[155117]: pam_unix(sudo:session): session closed for user root
Nov 28 17:18:48 compute-0 sudo[155195]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrkfncllmgbtgpbnxszhykvybrfcpxhs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350327.6473043-2320-100054779848772/AnsiballZ_file.py'
Nov 28 17:18:48 compute-0 sudo[155195]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:18:48 compute-0 python3.9[155197]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:18:48 compute-0 sudo[155195]: pam_unix(sudo:session): session closed for user root
Nov 28 17:18:49 compute-0 sudo[155347]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xoqmdkycpwolkmcizuefarmiyhhftujn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350328.9547153-2346-85605852997480/AnsiballZ_command.py'
Nov 28 17:18:49 compute-0 sudo[155347]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:18:49 compute-0 python3.9[155349]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 17:18:49 compute-0 sudo[155347]: pam_unix(sudo:session): session closed for user root
Nov 28 17:18:50 compute-0 sudo[155500]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zkvrisnqszeahbtvujpywofgigjsqzyg ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764350330.2261887-2362-124638949382155/AnsiballZ_edpm_nftables_from_files.py'
Nov 28 17:18:50 compute-0 sudo[155500]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:18:50 compute-0 python3[155502]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 28 17:18:50 compute-0 sudo[155500]: pam_unix(sudo:session): session closed for user root
Nov 28 17:18:51 compute-0 sudo[155652]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gtkbncdugmhaamnviuzcxaiqpjcgiobm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350331.2714345-2378-149353416868278/AnsiballZ_stat.py'
Nov 28 17:18:51 compute-0 sudo[155652]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:18:51 compute-0 python3.9[155654]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:18:51 compute-0 sudo[155652]: pam_unix(sudo:session): session closed for user root
Nov 28 17:18:52 compute-0 sudo[155730]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tphjpdefkwmaacvgjuykfakfsczypbfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350331.2714345-2378-149353416868278/AnsiballZ_file.py'
Nov 28 17:18:52 compute-0 sudo[155730]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:18:52 compute-0 python3.9[155732]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:18:52 compute-0 sudo[155730]: pam_unix(sudo:session): session closed for user root
Nov 28 17:18:52 compute-0 sudo[155882]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-klnaglztzukvmtjuwvtwipjqkjmcuzdt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350332.599045-2402-173389207048037/AnsiballZ_stat.py'
Nov 28 17:18:52 compute-0 sudo[155882]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:18:53 compute-0 python3.9[155884]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:18:53 compute-0 sudo[155882]: pam_unix(sudo:session): session closed for user root
Nov 28 17:18:53 compute-0 sudo[155960]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmaazlvkhrrfwwpfzfqmmkjwltbumueq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350332.599045-2402-173389207048037/AnsiballZ_file.py'
Nov 28 17:18:53 compute-0 sudo[155960]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:18:53 compute-0 python3.9[155962]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:18:53 compute-0 sudo[155960]: pam_unix(sudo:session): session closed for user root
Nov 28 17:18:54 compute-0 sudo[156112]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kjcfthyrogcszoabnjjufyuamuaxxiuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350333.8013072-2426-173243777022830/AnsiballZ_stat.py'
Nov 28 17:18:54 compute-0 sudo[156112]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:18:54 compute-0 python3.9[156114]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:18:54 compute-0 sudo[156112]: pam_unix(sudo:session): session closed for user root
Nov 28 17:18:54 compute-0 sudo[156190]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmvatarazfdfulwmdxbkbnnwnwrbmylu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350333.8013072-2426-173243777022830/AnsiballZ_file.py'
Nov 28 17:18:54 compute-0 sudo[156190]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:18:54 compute-0 python3.9[156192]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:18:54 compute-0 sudo[156190]: pam_unix(sudo:session): session closed for user root
Nov 28 17:18:55 compute-0 sudo[156342]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dnbrginnifllszmyxpwbtrrepqojqisa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350335.1406376-2450-239790925911112/AnsiballZ_stat.py'
Nov 28 17:18:55 compute-0 sudo[156342]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:18:55 compute-0 python3.9[156344]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:18:55 compute-0 sudo[156342]: pam_unix(sudo:session): session closed for user root
Nov 28 17:18:55 compute-0 sudo[156420]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlspethwrkzqabyfhxwfumeozmbflamt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350335.1406376-2450-239790925911112/AnsiballZ_file.py'
Nov 28 17:18:55 compute-0 sudo[156420]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:18:56 compute-0 python3.9[156422]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:18:56 compute-0 sudo[156420]: pam_unix(sudo:session): session closed for user root
Nov 28 17:18:56 compute-0 sudo[156572]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivlszkrnwkdsxavgftgzuuocvjrgbobd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350336.4104567-2474-280854434017752/AnsiballZ_stat.py'
Nov 28 17:18:56 compute-0 sudo[156572]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:18:56 compute-0 python3.9[156574]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:18:57 compute-0 sudo[156572]: pam_unix(sudo:session): session closed for user root
Nov 28 17:18:57 compute-0 sudo[156697]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kpsyqxvnmhydjjuytbxwkwizogwczpco ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350336.4104567-2474-280854434017752/AnsiballZ_copy.py'
Nov 28 17:18:57 compute-0 sudo[156697]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:18:57 compute-0 python3.9[156699]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764350336.4104567-2474-280854434017752/.source.nft follow=False _original_basename=ruleset.j2 checksum=8a12d4eb5149b6e500230381c1359a710881e9b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:18:57 compute-0 sudo[156697]: pam_unix(sudo:session): session closed for user root
Nov 28 17:18:58 compute-0 sudo[156849]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iupzziwnrexqvnjsiswwvmfcbwsnmxsz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350337.8749485-2504-263828880314573/AnsiballZ_file.py'
Nov 28 17:18:58 compute-0 sudo[156849]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:18:58 compute-0 python3.9[156851]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:18:58 compute-0 sudo[156849]: pam_unix(sudo:session): session closed for user root
Nov 28 17:18:58 compute-0 sudo[157001]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-surekcbcmwcjiqxyopxrootwzyisbnwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350338.5636814-2520-116729460409105/AnsiballZ_command.py'
Nov 28 17:18:58 compute-0 sudo[157001]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:18:59 compute-0 python3.9[157003]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 17:18:59 compute-0 sudo[157001]: pam_unix(sudo:session): session closed for user root
Nov 28 17:18:59 compute-0 sudo[157156]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ffmxytfgxshnhisrzhhxozdmaivyntbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350339.3320873-2536-240943204316861/AnsiballZ_blockinfile.py'
Nov 28 17:18:59 compute-0 sudo[157156]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:18:59 compute-0 python3.9[157158]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:18:59 compute-0 sudo[157156]: pam_unix(sudo:session): session closed for user root
Nov 28 17:19:00 compute-0 sudo[157308]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-awwrjjkijuntedkwxhsyqupeaqjpjxno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350340.3469026-2554-37051641011566/AnsiballZ_command.py'
Nov 28 17:19:00 compute-0 sudo[157308]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:19:00 compute-0 python3.9[157310]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 17:19:01 compute-0 sudo[157308]: pam_unix(sudo:session): session closed for user root
Nov 28 17:19:01 compute-0 sudo[157471]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfclvcehtypwpiqavenvecptdqbzmkmf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350341.2426562-2570-199001698162280/AnsiballZ_stat.py'
Nov 28 17:19:01 compute-0 sudo[157471]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:19:01 compute-0 podman[157435]: 2025-11-28 17:19:01.57980045 +0000 UTC m=+0.057277978 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 17:19:01 compute-0 python3.9[157481]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 17:19:01 compute-0 sudo[157471]: pam_unix(sudo:session): session closed for user root
Nov 28 17:19:02 compute-0 sudo[157634]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uaptfsechmiebokvutquwddwvonnupas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350342.021276-2586-61358149946703/AnsiballZ_command.py'
Nov 28 17:19:02 compute-0 sudo[157634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:19:02 compute-0 python3.9[157636]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 17:19:02 compute-0 sudo[157634]: pam_unix(sudo:session): session closed for user root
Nov 28 17:19:03 compute-0 sudo[157789]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfcgavlxgfjfpfhupdiqhauvprtcqpvy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350342.840899-2602-46167232193589/AnsiballZ_file.py'
Nov 28 17:19:03 compute-0 sudo[157789]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:19:03 compute-0 python3.9[157791]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:19:03 compute-0 sudo[157789]: pam_unix(sudo:session): session closed for user root
Nov 28 17:19:03 compute-0 sudo[157941]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jowqhqsmatukdflyonpnowmzaljdjoxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350343.576066-2618-112446150977132/AnsiballZ_stat.py'
Nov 28 17:19:03 compute-0 sudo[157941]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:19:04 compute-0 python3.9[157943]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:19:04 compute-0 sudo[157941]: pam_unix(sudo:session): session closed for user root
Nov 28 17:19:04 compute-0 sudo[158064]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-myibtqpscbtglalecdenuxxgczlostgt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350343.576066-2618-112446150977132/AnsiballZ_copy.py'
Nov 28 17:19:04 compute-0 sudo[158064]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:19:04 compute-0 python3.9[158066]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764350343.576066-2618-112446150977132/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:19:04 compute-0 sudo[158064]: pam_unix(sudo:session): session closed for user root
Nov 28 17:19:05 compute-0 sudo[158240]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-slohqksszxubylmkolqcnefbbhggvphd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350344.8807044-2648-36120975235179/AnsiballZ_stat.py'
Nov 28 17:19:05 compute-0 sudo[158240]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:19:05 compute-0 podman[158166]: 2025-11-28 17:19:05.235148536 +0000 UTC m=+0.101962950 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, container_name=ovn_controller)
Nov 28 17:19:05 compute-0 python3.9[158242]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:19:05 compute-0 sudo[158240]: pam_unix(sudo:session): session closed for user root
Nov 28 17:19:05 compute-0 sudo[158363]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-srutljkcczucduivlcsgjecxrnjypher ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350344.8807044-2648-36120975235179/AnsiballZ_copy.py'
Nov 28 17:19:05 compute-0 sudo[158363]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:19:06 compute-0 python3.9[158365]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764350344.8807044-2648-36120975235179/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:19:06 compute-0 sudo[158363]: pam_unix(sudo:session): session closed for user root
Nov 28 17:19:06 compute-0 sudo[158515]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lknqklrudhzhusjojcthcgpwfwgarzph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350346.4518907-2678-244744036021497/AnsiballZ_stat.py'
Nov 28 17:19:06 compute-0 sudo[158515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:19:06 compute-0 python3.9[158517]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:19:06 compute-0 sudo[158515]: pam_unix(sudo:session): session closed for user root
Nov 28 17:19:07 compute-0 sudo[158638]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzrojczvngvloyldicbhfaekzwbatcla ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350346.4518907-2678-244744036021497/AnsiballZ_copy.py'
Nov 28 17:19:07 compute-0 sudo[158638]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:19:07 compute-0 python3.9[158640]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764350346.4518907-2678-244744036021497/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:19:07 compute-0 sudo[158638]: pam_unix(sudo:session): session closed for user root
Nov 28 17:19:08 compute-0 sudo[158790]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhsjlsddrrzygbhgcdugzfohwiwalpjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350347.9235582-2708-136224322077858/AnsiballZ_systemd.py'
Nov 28 17:19:08 compute-0 sudo[158790]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:19:08 compute-0 python3.9[158792]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 17:19:08 compute-0 systemd[1]: Reloading.
Nov 28 17:19:08 compute-0 systemd-rc-local-generator[158821]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 17:19:08 compute-0 systemd-sysv-generator[158825]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 17:19:08 compute-0 systemd[1]: Reached target edpm_libvirt.target.
Nov 28 17:19:08 compute-0 sudo[158790]: pam_unix(sudo:session): session closed for user root
Nov 28 17:19:09 compute-0 sudo[158982]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmvwubvajrtqsprgrujpjgasstzslkjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350349.123835-2724-171871153815440/AnsiballZ_systemd.py'
Nov 28 17:19:09 compute-0 sudo[158982]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:19:09 compute-0 python3.9[158984]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 28 17:19:09 compute-0 systemd[1]: Reloading.
Nov 28 17:19:09 compute-0 systemd-rc-local-generator[159012]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 17:19:09 compute-0 systemd-sysv-generator[159016]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 17:19:10 compute-0 systemd[1]: Reloading.
Nov 28 17:19:10 compute-0 systemd-sysv-generator[159054]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 17:19:10 compute-0 systemd-rc-local-generator[159050]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 17:19:10 compute-0 sudo[158982]: pam_unix(sudo:session): session closed for user root
Nov 28 17:19:10 compute-0 sshd-session[104581]: Connection closed by 192.168.122.30 port 44296
Nov 28 17:19:10 compute-0 sshd-session[104578]: pam_unix(sshd:session): session closed for user zuul
Nov 28 17:19:10 compute-0 systemd[1]: session-23.scope: Deactivated successfully.
Nov 28 17:19:10 compute-0 systemd[1]: session-23.scope: Consumed 3min 16.077s CPU time.
Nov 28 17:19:10 compute-0 systemd-logind[788]: Session 23 logged out. Waiting for processes to exit.
Nov 28 17:19:10 compute-0 systemd-logind[788]: Removed session 23.
Nov 28 17:19:17 compute-0 sshd-session[159082]: Accepted publickey for zuul from 192.168.122.30 port 42910 ssh2: ECDSA SHA256:WZF57Px3euv6N78lWIGFzFXKuQ6jzdjGOx/DgOfNYD8
Nov 28 17:19:17 compute-0 systemd-logind[788]: New session 24 of user zuul.
Nov 28 17:19:17 compute-0 systemd[1]: Started Session 24 of User zuul.
Nov 28 17:19:17 compute-0 sshd-session[159082]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 28 17:19:18 compute-0 python3.9[159235]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 17:19:19 compute-0 python3.9[159389]: ansible-ansible.builtin.service_facts Invoked
Nov 28 17:19:19 compute-0 network[159406]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 28 17:19:19 compute-0 network[159407]: 'network-scripts' will be removed from distribution in near future.
Nov 28 17:19:19 compute-0 network[159408]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 28 17:19:26 compute-0 sudo[159677]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjqqvrsvpbntttfhapsxxqheuhgeeozx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350366.2032225-79-101563376036876/AnsiballZ_setup.py'
Nov 28 17:19:26 compute-0 sudo[159677]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:19:26 compute-0 python3.9[159679]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 17:19:27 compute-0 sudo[159677]: pam_unix(sudo:session): session closed for user root
Nov 28 17:19:27 compute-0 sudo[159761]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bckquydwuqlvysjejpudlcwzubrrtqvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350366.2032225-79-101563376036876/AnsiballZ_dnf.py'
Nov 28 17:19:27 compute-0 sudo[159761]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:19:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:19:27.660 104433 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:19:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:19:27.662 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:19:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:19:27.662 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:19:28 compute-0 python3.9[159763]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 17:19:32 compute-0 podman[159765]: 2025-11-28 17:19:32.216309665 +0000 UTC m=+0.071400059 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 28 17:19:33 compute-0 sudo[159761]: pam_unix(sudo:session): session closed for user root
Nov 28 17:19:34 compute-0 sudo[159933]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yscudhlejohofsszlnkkthzxlimgdzim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350373.5747364-103-116081095905131/AnsiballZ_stat.py'
Nov 28 17:19:34 compute-0 sudo[159933]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:19:34 compute-0 python3.9[159935]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 17:19:34 compute-0 sudo[159933]: pam_unix(sudo:session): session closed for user root
Nov 28 17:19:34 compute-0 sudo[160085]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evziztaqhlbnzpypgmuxsseuggiarrug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350374.5502615-123-72179992305569/AnsiballZ_command.py'
Nov 28 17:19:34 compute-0 sudo[160085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:19:35 compute-0 python3.9[160087]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 17:19:35 compute-0 sudo[160085]: pam_unix(sudo:session): session closed for user root
Nov 28 17:19:35 compute-0 sudo[160253]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jcqrtncaafxfsfnzobjwsytondxkkfjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350375.5943909-143-226566933900382/AnsiballZ_stat.py'
Nov 28 17:19:35 compute-0 sudo[160253]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:19:35 compute-0 podman[160212]: 2025-11-28 17:19:35.934880363 +0000 UTC m=+0.104822632 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 28 17:19:36 compute-0 python3.9[160259]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 17:19:36 compute-0 sudo[160253]: pam_unix(sudo:session): session closed for user root
Nov 28 17:19:36 compute-0 sudo[160416]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uuhqlzjhhplgigzwgsvknfbxczvvegok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350376.244944-159-245567342451940/AnsiballZ_command.py'
Nov 28 17:19:36 compute-0 sudo[160416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:19:36 compute-0 python3.9[160418]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 17:19:36 compute-0 sudo[160416]: pam_unix(sudo:session): session closed for user root
Nov 28 17:19:37 compute-0 sudo[160569]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-paksonwcjyukkxxzwsoiaarqwvgcxbyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350376.9776928-175-8357358883356/AnsiballZ_stat.py'
Nov 28 17:19:37 compute-0 sudo[160569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:19:37 compute-0 python3.9[160571]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:19:37 compute-0 sudo[160569]: pam_unix(sudo:session): session closed for user root
Nov 28 17:19:38 compute-0 sudo[160692]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kkvmuwjlgiuvjtcxtndjocsqjwbwsobu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350376.9776928-175-8357358883356/AnsiballZ_copy.py'
Nov 28 17:19:38 compute-0 sudo[160692]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:19:38 compute-0 python3.9[160694]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764350376.9776928-175-8357358883356/.source.iscsi _original_basename=.0bq0axxn follow=False checksum=bd6383ec5c92a39d34c193a0920df212ba66e1c9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:19:38 compute-0 sudo[160692]: pam_unix(sudo:session): session closed for user root
Nov 28 17:19:38 compute-0 sudo[160844]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ineyhyfrhlqjijkrohakthphtumykepj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350378.4855998-205-4021936356915/AnsiballZ_file.py'
Nov 28 17:19:38 compute-0 sudo[160844]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:19:39 compute-0 python3.9[160846]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:19:39 compute-0 sudo[160844]: pam_unix(sudo:session): session closed for user root
Nov 28 17:19:39 compute-0 sudo[160996]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfcrbdqgbeknxqdxihutrnuhtkxjkwir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350379.4670103-221-229985173078286/AnsiballZ_lineinfile.py'
Nov 28 17:19:39 compute-0 sudo[160996]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:19:40 compute-0 python3.9[160998]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:19:40 compute-0 sudo[160996]: pam_unix(sudo:session): session closed for user root
Nov 28 17:19:40 compute-0 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 17:19:40 compute-0 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 17:19:40 compute-0 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 17:19:41 compute-0 sudo[161149]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmtwlnksrbhrrhddkmuflrqpgythteva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350380.8088856-239-275563800576352/AnsiballZ_systemd_service.py'
Nov 28 17:19:41 compute-0 sudo[161149]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:19:41 compute-0 python3.9[161151]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 17:19:41 compute-0 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Nov 28 17:19:42 compute-0 sudo[161149]: pam_unix(sudo:session): session closed for user root
Nov 28 17:19:43 compute-0 sudo[161305]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibagvnpvjkmylwihzjvqpbxjvojzokqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350382.934239-255-185270767635831/AnsiballZ_systemd_service.py'
Nov 28 17:19:43 compute-0 sudo[161305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:19:43 compute-0 python3.9[161307]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 17:19:43 compute-0 systemd[1]: Reloading.
Nov 28 17:19:43 compute-0 systemd-sysv-generator[161342]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 17:19:43 compute-0 systemd-rc-local-generator[161339]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 17:19:43 compute-0 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Nov 28 17:19:43 compute-0 systemd[1]: Starting Open-iSCSI...
Nov 28 17:19:44 compute-0 kernel: Loading iSCSI transport class v2.0-870.
Nov 28 17:19:44 compute-0 systemd[1]: Started Open-iSCSI.
Nov 28 17:19:44 compute-0 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Nov 28 17:19:44 compute-0 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Nov 28 17:19:44 compute-0 sudo[161305]: pam_unix(sudo:session): session closed for user root
Nov 28 17:19:44 compute-0 sudo[161507]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sszsrrlpneofphdnlynnstzchafljjjf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350384.6754704-277-246179020988092/AnsiballZ_service_facts.py'
Nov 28 17:19:45 compute-0 sudo[161507]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:19:45 compute-0 python3.9[161509]: ansible-ansible.builtin.service_facts Invoked
Nov 28 17:19:45 compute-0 network[161526]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 28 17:19:45 compute-0 network[161527]: 'network-scripts' will be removed from distribution in near future.
Nov 28 17:19:45 compute-0 network[161528]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 28 17:19:50 compute-0 sudo[161507]: pam_unix(sudo:session): session closed for user root
Nov 28 17:19:50 compute-0 sudo[161797]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-boxlshtevewuceuwqczbabowiouqiset ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350390.698007-297-51510946057384/AnsiballZ_file.py'
Nov 28 17:19:50 compute-0 sudo[161797]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:19:51 compute-0 python3.9[161799]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 28 17:19:51 compute-0 sudo[161797]: pam_unix(sudo:session): session closed for user root
Nov 28 17:19:52 compute-0 sudo[161949]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sabdrfgewyjlwdoblhblvncbanrxfzra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350391.490707-313-167672134871005/AnsiballZ_modprobe.py'
Nov 28 17:19:52 compute-0 sudo[161949]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:19:52 compute-0 python3.9[161951]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Nov 28 17:19:52 compute-0 sudo[161949]: pam_unix(sudo:session): session closed for user root
Nov 28 17:19:52 compute-0 sudo[162105]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbllbhijsssvuykqqswqgfqivedmggtw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350392.5528302-329-62386658990758/AnsiballZ_stat.py'
Nov 28 17:19:52 compute-0 sudo[162105]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:19:53 compute-0 python3.9[162107]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:19:53 compute-0 sudo[162105]: pam_unix(sudo:session): session closed for user root
Nov 28 17:19:53 compute-0 sudo[162228]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oikstipmdktosnoqdctmkznmbszrsxhv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350392.5528302-329-62386658990758/AnsiballZ_copy.py'
Nov 28 17:19:53 compute-0 sudo[162228]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:19:53 compute-0 python3.9[162230]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764350392.5528302-329-62386658990758/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:19:53 compute-0 sudo[162228]: pam_unix(sudo:session): session closed for user root
Nov 28 17:19:54 compute-0 sudo[162380]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-banmruacobaojqhjtiekirvitixsrykq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350394.1134455-361-96484688739152/AnsiballZ_lineinfile.py'
Nov 28 17:19:54 compute-0 sudo[162380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:19:54 compute-0 python3.9[162382]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:19:54 compute-0 sudo[162380]: pam_unix(sudo:session): session closed for user root
Nov 28 17:19:55 compute-0 sudo[162532]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwhkrfcwogbhyhznqfaytmyccmxphlcy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350394.7967331-377-269488510381160/AnsiballZ_systemd.py'
Nov 28 17:19:55 compute-0 sudo[162532]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:19:55 compute-0 python3.9[162534]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 17:19:55 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Nov 28 17:19:55 compute-0 systemd[1]: Stopped Load Kernel Modules.
Nov 28 17:19:55 compute-0 systemd[1]: Stopping Load Kernel Modules...
Nov 28 17:19:55 compute-0 systemd[1]: Starting Load Kernel Modules...
Nov 28 17:19:55 compute-0 systemd[1]: Finished Load Kernel Modules.
Nov 28 17:19:55 compute-0 sudo[162532]: pam_unix(sudo:session): session closed for user root
Nov 28 17:19:56 compute-0 sudo[162688]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-icublzruzetltxzcyfzlkucjztvuqrbc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350395.969873-393-222641544357624/AnsiballZ_file.py'
Nov 28 17:19:56 compute-0 sudo[162688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:19:56 compute-0 python3.9[162690]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:19:56 compute-0 sudo[162688]: pam_unix(sudo:session): session closed for user root
Nov 28 17:19:57 compute-0 sudo[162840]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kolktlqdtlptleaaoawpzqjhxqlaesft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350396.7933297-411-33944579917674/AnsiballZ_stat.py'
Nov 28 17:19:57 compute-0 sudo[162840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:19:57 compute-0 python3.9[162842]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 17:19:57 compute-0 sudo[162840]: pam_unix(sudo:session): session closed for user root
Nov 28 17:19:57 compute-0 sudo[162992]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqlccrytsrttmjdbjqkvxoixkylgyomh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350397.5632346-429-129363466336766/AnsiballZ_stat.py'
Nov 28 17:19:57 compute-0 sudo[162992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:19:58 compute-0 python3.9[162994]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 17:19:58 compute-0 sudo[162992]: pam_unix(sudo:session): session closed for user root
Nov 28 17:19:58 compute-0 sudo[163144]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqunieahfzpfcujvypdtwwdwrtonkbyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350398.3693576-445-58450336484533/AnsiballZ_stat.py'
Nov 28 17:19:58 compute-0 sudo[163144]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:19:58 compute-0 python3.9[163146]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:19:58 compute-0 sudo[163144]: pam_unix(sudo:session): session closed for user root
Nov 28 17:19:59 compute-0 sudo[163267]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlnbosgsllpjqkahlluglcikzupipnkx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350398.3693576-445-58450336484533/AnsiballZ_copy.py'
Nov 28 17:19:59 compute-0 sudo[163267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:19:59 compute-0 python3.9[163269]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764350398.3693576-445-58450336484533/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:19:59 compute-0 sudo[163267]: pam_unix(sudo:session): session closed for user root
Nov 28 17:20:00 compute-0 sudo[163419]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yeeodaucbdwlcsmqgbiiupnfmlujjnaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350399.8344676-475-23220162776918/AnsiballZ_command.py'
Nov 28 17:20:00 compute-0 sudo[163419]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:20:00 compute-0 python3.9[163421]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 17:20:00 compute-0 sudo[163419]: pam_unix(sudo:session): session closed for user root
Nov 28 17:20:00 compute-0 sudo[163572]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-juljtedtsbdesdfotbanccrvudtfbxfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350400.5217762-491-41494424394618/AnsiballZ_lineinfile.py'
Nov 28 17:20:00 compute-0 sudo[163572]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:20:01 compute-0 python3.9[163574]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:20:01 compute-0 sudo[163572]: pam_unix(sudo:session): session closed for user root
Nov 28 17:20:01 compute-0 sudo[163724]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uydmxnowizmhfcrnuhnbyetegxncyesh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350401.2716906-507-96277741191739/AnsiballZ_replace.py'
Nov 28 17:20:01 compute-0 sudo[163724]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:20:01 compute-0 python3.9[163726]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:20:01 compute-0 sudo[163724]: pam_unix(sudo:session): session closed for user root
Nov 28 17:20:02 compute-0 podman[163850]: 2025-11-28 17:20:02.797820082 +0000 UTC m=+0.067507879 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 28 17:20:02 compute-0 sudo[163892]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-acqbawdptjjwozlcqkrmezxywxpjlipo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350402.4785986-523-134705063673177/AnsiballZ_replace.py'
Nov 28 17:20:02 compute-0 sudo[163892]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:20:02 compute-0 python3.9[163896]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:20:03 compute-0 sudo[163892]: pam_unix(sudo:session): session closed for user root
Nov 28 17:20:03 compute-0 sudo[164047]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rtcwjmssbllibyrwimtrgblilbacnlyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350403.2699103-541-51703461089081/AnsiballZ_lineinfile.py'
Nov 28 17:20:03 compute-0 sudo[164047]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:20:03 compute-0 python3.9[164049]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:20:03 compute-0 sudo[164047]: pam_unix(sudo:session): session closed for user root
Nov 28 17:20:04 compute-0 sudo[164199]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bikqdgmujlcsbsairsadolrwrjnbdxmz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350403.989624-541-138250439640339/AnsiballZ_lineinfile.py'
Nov 28 17:20:04 compute-0 sudo[164199]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:20:04 compute-0 python3.9[164201]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:20:04 compute-0 sudo[164199]: pam_unix(sudo:session): session closed for user root
Nov 28 17:20:04 compute-0 sudo[164351]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wkvudnpugvmrfbqmtgjlausqxttqpbna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350404.6410456-541-34324066832862/AnsiballZ_lineinfile.py'
Nov 28 17:20:04 compute-0 sudo[164351]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:20:05 compute-0 python3.9[164353]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:20:05 compute-0 sudo[164351]: pam_unix(sudo:session): session closed for user root
Nov 28 17:20:05 compute-0 sudo[164503]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lhdfxtxxajbyhiyzbunarvsxhvnystru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350405.451145-541-182330729021376/AnsiballZ_lineinfile.py'
Nov 28 17:20:05 compute-0 sudo[164503]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:20:05 compute-0 python3.9[164505]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:20:05 compute-0 sudo[164503]: pam_unix(sudo:session): session closed for user root
Nov 28 17:20:06 compute-0 podman[164553]: 2025-11-28 17:20:06.24956842 +0000 UTC m=+0.105809602 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true)
Nov 28 17:20:06 compute-0 sudo[164681]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-whkaeedkqdyrsimehhrweackvweyvmay ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350406.1304727-599-6091965048864/AnsiballZ_stat.py'
Nov 28 17:20:06 compute-0 sudo[164681]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:20:07 compute-0 python3.9[164683]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 17:20:07 compute-0 sudo[164681]: pam_unix(sudo:session): session closed for user root
Nov 28 17:20:07 compute-0 sudo[164835]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etxothjhhymnadxjczfsflyihhysuwtk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350407.397655-615-22217897182018/AnsiballZ_file.py'
Nov 28 17:20:07 compute-0 sudo[164835]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:20:07 compute-0 python3.9[164837]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:20:07 compute-0 sudo[164835]: pam_unix(sudo:session): session closed for user root
Nov 28 17:20:08 compute-0 sudo[164987]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jficepvwqznyxravjmwefdtwhkhjmzso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350408.229883-633-164402764518577/AnsiballZ_file.py'
Nov 28 17:20:08 compute-0 sudo[164987]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:20:08 compute-0 python3.9[164989]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:20:08 compute-0 sudo[164987]: pam_unix(sudo:session): session closed for user root
Nov 28 17:20:09 compute-0 sudo[165139]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhpnebbjzfonsdxyoxspfbwazvmazlrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350408.9393146-649-281360098317905/AnsiballZ_stat.py'
Nov 28 17:20:09 compute-0 sudo[165139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:20:09 compute-0 python3.9[165141]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:20:09 compute-0 sudo[165139]: pam_unix(sudo:session): session closed for user root
Nov 28 17:20:09 compute-0 sudo[165217]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbyolrtkjhtzenmtbuohmwqiqsrtnscm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350408.9393146-649-281360098317905/AnsiballZ_file.py'
Nov 28 17:20:09 compute-0 sudo[165217]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:20:09 compute-0 python3.9[165219]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:20:09 compute-0 sudo[165217]: pam_unix(sudo:session): session closed for user root
Nov 28 17:20:10 compute-0 sudo[165369]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nvzzdpppnvaviqsrenxgfyfkrmdlbbhc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350409.973884-649-38329263399740/AnsiballZ_stat.py'
Nov 28 17:20:10 compute-0 sudo[165369]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:20:10 compute-0 python3.9[165371]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:20:10 compute-0 sudo[165369]: pam_unix(sudo:session): session closed for user root
Nov 28 17:20:10 compute-0 sudo[165447]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iihwklvcpijlqyrrrpxenlmfqknslaml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350409.973884-649-38329263399740/AnsiballZ_file.py'
Nov 28 17:20:10 compute-0 sudo[165447]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:20:10 compute-0 python3.9[165449]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:20:10 compute-0 sudo[165447]: pam_unix(sudo:session): session closed for user root
Nov 28 17:20:11 compute-0 sudo[165599]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-srtbtygtqjaezlsvltrgjxwrleicbbyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350411.605197-695-139496337894642/AnsiballZ_file.py'
Nov 28 17:20:11 compute-0 sudo[165599]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:20:12 compute-0 python3.9[165601]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:20:12 compute-0 sudo[165599]: pam_unix(sudo:session): session closed for user root
Nov 28 17:20:12 compute-0 sudo[165751]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oymzcpqubiksmgzponvlncstiqwxqyha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350412.355752-711-63047219022679/AnsiballZ_stat.py'
Nov 28 17:20:12 compute-0 sudo[165751]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:20:12 compute-0 python3.9[165753]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:20:12 compute-0 sudo[165751]: pam_unix(sudo:session): session closed for user root
Nov 28 17:20:13 compute-0 sudo[165829]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvwvucmhtsadbfxruqzyhtaxveyhbtua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350412.355752-711-63047219022679/AnsiballZ_file.py'
Nov 28 17:20:13 compute-0 sudo[165829]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:20:13 compute-0 python3.9[165831]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:20:13 compute-0 sudo[165829]: pam_unix(sudo:session): session closed for user root
Nov 28 17:20:13 compute-0 sudo[165981]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xfpjbgdgoiwisilxatecyjahsquoxcbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350413.551487-735-121645323868349/AnsiballZ_stat.py'
Nov 28 17:20:13 compute-0 sudo[165981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:20:14 compute-0 python3.9[165983]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:20:14 compute-0 sudo[165981]: pam_unix(sudo:session): session closed for user root
Nov 28 17:20:14 compute-0 sudo[166059]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-irxdfdemkotmvfvsbckitzpzwdclqbvv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350413.551487-735-121645323868349/AnsiballZ_file.py'
Nov 28 17:20:14 compute-0 sudo[166059]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:20:14 compute-0 python3.9[166061]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:20:14 compute-0 sudo[166059]: pam_unix(sudo:session): session closed for user root
Nov 28 17:20:15 compute-0 sudo[166211]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ocdrbneeymdavgyzmqgusoyvwptkvchd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350414.9423864-759-102003400456219/AnsiballZ_systemd.py'
Nov 28 17:20:15 compute-0 sudo[166211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:20:15 compute-0 python3.9[166213]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 17:20:15 compute-0 systemd[1]: Reloading.
Nov 28 17:20:15 compute-0 systemd-rc-local-generator[166242]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 17:20:15 compute-0 systemd-sysv-generator[166245]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 17:20:15 compute-0 sudo[166211]: pam_unix(sudo:session): session closed for user root
Nov 28 17:20:16 compute-0 sudo[166401]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jafnvqxrwqqygelfiseltztaxjcrobun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350416.1317697-775-176982253795186/AnsiballZ_stat.py'
Nov 28 17:20:16 compute-0 sudo[166401]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:20:16 compute-0 python3.9[166403]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:20:16 compute-0 sudo[166401]: pam_unix(sudo:session): session closed for user root
Nov 28 17:20:16 compute-0 sudo[166479]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yxlujhwuhctsygsntypnbslcpwieonii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350416.1317697-775-176982253795186/AnsiballZ_file.py'
Nov 28 17:20:16 compute-0 sudo[166479]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:20:17 compute-0 python3.9[166481]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:20:17 compute-0 sudo[166479]: pam_unix(sudo:session): session closed for user root
Nov 28 17:20:17 compute-0 sudo[166631]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lloicqpwwzpqdzjqyeyjecoyoxnvzjta ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350417.2922256-799-14360536145219/AnsiballZ_stat.py'
Nov 28 17:20:17 compute-0 sudo[166631]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:20:17 compute-0 python3.9[166633]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:20:17 compute-0 sudo[166631]: pam_unix(sudo:session): session closed for user root
Nov 28 17:20:18 compute-0 sudo[166709]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wopguckohelnlefdmkrlrybvaztgfjcs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350417.2922256-799-14360536145219/AnsiballZ_file.py'
Nov 28 17:20:18 compute-0 sudo[166709]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:20:18 compute-0 python3.9[166711]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:20:18 compute-0 sudo[166709]: pam_unix(sudo:session): session closed for user root
Nov 28 17:20:18 compute-0 sudo[166861]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewfztoqacwouipsnyopbetazmxinckkd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350418.4445715-823-86345647563623/AnsiballZ_systemd.py'
Nov 28 17:20:18 compute-0 sudo[166861]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:20:18 compute-0 python3.9[166863]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 17:20:19 compute-0 systemd[1]: Reloading.
Nov 28 17:20:19 compute-0 systemd-sysv-generator[166892]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 17:20:19 compute-0 systemd-rc-local-generator[166889]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 17:20:19 compute-0 systemd[1]: Starting Create netns directory...
Nov 28 17:20:19 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 28 17:20:19 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 28 17:20:19 compute-0 systemd[1]: Finished Create netns directory.
Nov 28 17:20:19 compute-0 sudo[166861]: pam_unix(sudo:session): session closed for user root
Nov 28 17:20:19 compute-0 sudo[167054]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uqjgcrblvvnbkfyextiqgzmpokouvuft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350419.6801744-843-280790572105491/AnsiballZ_file.py'
Nov 28 17:20:19 compute-0 sudo[167054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:20:20 compute-0 python3.9[167056]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:20:20 compute-0 sudo[167054]: pam_unix(sudo:session): session closed for user root
Nov 28 17:20:20 compute-0 sudo[167206]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yechlvzyclodoisnosavhfbjzoectppk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350420.370692-859-4440572220467/AnsiballZ_stat.py'
Nov 28 17:20:20 compute-0 sudo[167206]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:20:20 compute-0 python3.9[167208]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:20:20 compute-0 sudo[167206]: pam_unix(sudo:session): session closed for user root
Nov 28 17:20:21 compute-0 sudo[167329]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-przghcjlaknaprnemeztpdqjtqqqwool ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350420.370692-859-4440572220467/AnsiballZ_copy.py'
Nov 28 17:20:21 compute-0 sudo[167329]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:20:21 compute-0 python3.9[167331]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764350420.370692-859-4440572220467/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:20:21 compute-0 sudo[167329]: pam_unix(sudo:session): session closed for user root
Nov 28 17:20:22 compute-0 sudo[167481]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vcesdllklnofjhqckpkbdqdvyevlbvhm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350421.8616688-893-268132073334480/AnsiballZ_file.py'
Nov 28 17:20:22 compute-0 sudo[167481]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:20:22 compute-0 python3.9[167483]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:20:22 compute-0 sudo[167481]: pam_unix(sudo:session): session closed for user root
Nov 28 17:20:22 compute-0 sudo[167633]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hyphadmzpjeomhvrqgyygmdcxcjfvfay ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350422.4840426-909-14268475140450/AnsiballZ_stat.py'
Nov 28 17:20:22 compute-0 sudo[167633]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:20:23 compute-0 python3.9[167635]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:20:23 compute-0 sudo[167633]: pam_unix(sudo:session): session closed for user root
Nov 28 17:20:23 compute-0 sudo[167756]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvaajpiiucgsedkacsnfykrvkaowfuln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350422.4840426-909-14268475140450/AnsiballZ_copy.py'
Nov 28 17:20:23 compute-0 sudo[167756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:20:23 compute-0 python3.9[167758]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764350422.4840426-909-14268475140450/.source.json _original_basename=.6po2x8eg follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:20:23 compute-0 sudo[167756]: pam_unix(sudo:session): session closed for user root
Nov 28 17:20:24 compute-0 sudo[167908]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mqizbvkldegunhtwwfrybmuirroirifo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350423.7907124-939-241851588088216/AnsiballZ_file.py'
Nov 28 17:20:24 compute-0 sudo[167908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:20:24 compute-0 python3.9[167910]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:20:24 compute-0 sudo[167908]: pam_unix(sudo:session): session closed for user root
Nov 28 17:20:24 compute-0 sudo[168060]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zssbisdjxqmibtxmdxjbwzutmdwdoiiw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350424.6132324-955-263360058325701/AnsiballZ_stat.py'
Nov 28 17:20:24 compute-0 sudo[168060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:20:25 compute-0 sudo[168060]: pam_unix(sudo:session): session closed for user root
Nov 28 17:20:25 compute-0 sudo[168183]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwdyfzpryqbsmuthsigvekrlbjndaxcx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350424.6132324-955-263360058325701/AnsiballZ_copy.py'
Nov 28 17:20:25 compute-0 sudo[168183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:20:25 compute-0 sudo[168183]: pam_unix(sudo:session): session closed for user root
Nov 28 17:20:26 compute-0 sudo[168335]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utcdrmuuxpkimcutxmupcetomtiedddv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350426.0943654-989-192682700674235/AnsiballZ_container_config_data.py'
Nov 28 17:20:26 compute-0 sudo[168335]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:20:26 compute-0 python3.9[168337]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Nov 28 17:20:26 compute-0 sudo[168335]: pam_unix(sudo:session): session closed for user root
Nov 28 17:20:27 compute-0 sudo[168487]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-buzfdxwwbvunefyjisdxlxukscfkilvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350427.1109016-1007-266986725196316/AnsiballZ_container_config_hash.py'
Nov 28 17:20:27 compute-0 sudo[168487]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:20:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:20:27.661 104433 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:20:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:20:27.664 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:20:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:20:27.664 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:20:27 compute-0 python3.9[168489]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 28 17:20:27 compute-0 sudo[168487]: pam_unix(sudo:session): session closed for user root
Nov 28 17:20:28 compute-0 sudo[168639]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfbkwxumcbmheojshhkabejrhvtuptyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350428.041103-1025-79245245646729/AnsiballZ_podman_container_info.py'
Nov 28 17:20:28 compute-0 sudo[168639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:20:28 compute-0 python3.9[168641]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 28 17:20:28 compute-0 sudo[168639]: pam_unix(sudo:session): session closed for user root
Nov 28 17:20:29 compute-0 sudo[168818]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhzsyxnggeocisgdtzizgwzocjijwkct ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764350429.4588985-1051-98457723801614/AnsiballZ_edpm_container_manage.py'
Nov 28 17:20:29 compute-0 sudo[168818]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:20:30 compute-0 python3[168820]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 28 17:20:30 compute-0 podman[168856]: 2025-11-28 17:20:30.36567397 +0000 UTC m=+0.049009857 container create 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd)
Nov 28 17:20:30 compute-0 podman[168856]: 2025-11-28 17:20:30.336605238 +0000 UTC m=+0.019941155 image pull f275b8d168f7f57f31e3da49224019f39f95c80a833f083696a964527b07b54f quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Nov 28 17:20:30 compute-0 python3[168820]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Nov 28 17:20:30 compute-0 sudo[168818]: pam_unix(sudo:session): session closed for user root
Nov 28 17:20:30 compute-0 systemd[1]: virtnodedevd.service: Deactivated successfully.
Nov 28 17:20:30 compute-0 sudo[169045]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tkshpskdwkqpqnqizhigmkxjkrhprvya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350430.7007647-1067-2168771112583/AnsiballZ_stat.py'
Nov 28 17:20:30 compute-0 sudo[169045]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:20:31 compute-0 python3.9[169047]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 17:20:31 compute-0 sudo[169045]: pam_unix(sudo:session): session closed for user root
Nov 28 17:20:31 compute-0 sudo[169199]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-acbqbzyzcsbvftxddwvnzqfhjvjpssmo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350431.475216-1085-266859094452386/AnsiballZ_file.py'
Nov 28 17:20:31 compute-0 sudo[169199]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:20:31 compute-0 python3.9[169201]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:20:31 compute-0 sudo[169199]: pam_unix(sudo:session): session closed for user root
Nov 28 17:20:32 compute-0 sudo[169275]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtajwqfnepawhiqdvqrmtooillvwjbpb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350431.475216-1085-266859094452386/AnsiballZ_stat.py'
Nov 28 17:20:32 compute-0 sudo[169275]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:20:32 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Nov 28 17:20:32 compute-0 python3.9[169277]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 17:20:32 compute-0 sudo[169275]: pam_unix(sudo:session): session closed for user root
Nov 28 17:20:32 compute-0 sudo[169427]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zgbodwrztjhabpkiuneiikonqyvxshse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350432.4156408-1085-19499243898475/AnsiballZ_copy.py'
Nov 28 17:20:32 compute-0 sudo[169427]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:20:32 compute-0 podman[169429]: 2025-11-28 17:20:32.945581024 +0000 UTC m=+0.061043209 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 17:20:33 compute-0 python3.9[169430]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764350432.4156408-1085-19499243898475/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:20:33 compute-0 sudo[169427]: pam_unix(sudo:session): session closed for user root
Nov 28 17:20:33 compute-0 sudo[169519]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zrzajnjokjydvhlselftwxkxlivsunks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350432.4156408-1085-19499243898475/AnsiballZ_systemd.py'
Nov 28 17:20:33 compute-0 sudo[169519]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:20:33 compute-0 systemd[1]: virtqemud.service: Deactivated successfully.
Nov 28 17:20:33 compute-0 python3.9[169521]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 17:20:33 compute-0 systemd[1]: Reloading.
Nov 28 17:20:33 compute-0 systemd-rc-local-generator[169552]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 17:20:33 compute-0 systemd-sysv-generator[169556]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 17:20:34 compute-0 sudo[169519]: pam_unix(sudo:session): session closed for user root
Nov 28 17:20:34 compute-0 sudo[169631]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wcpdkggziuyfxrxqyjikfvuyuspgcpli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350432.4156408-1085-19499243898475/AnsiballZ_systemd.py'
Nov 28 17:20:34 compute-0 sudo[169631]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:20:34 compute-0 python3.9[169633]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 17:20:34 compute-0 systemd[1]: Reloading.
Nov 28 17:20:34 compute-0 systemd-rc-local-generator[169665]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 17:20:34 compute-0 systemd-sysv-generator[169669]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 17:20:34 compute-0 systemd[1]: virtsecretd.service: Deactivated successfully.
Nov 28 17:20:34 compute-0 systemd[1]: Starting multipathd container...
Nov 28 17:20:35 compute-0 systemd[1]: Started libcrun container.
Nov 28 17:20:35 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f86196bcf5bba18c39beda305b9413b72a210dfa82833c4ebe0fd4ede370c9ca/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 28 17:20:35 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f86196bcf5bba18c39beda305b9413b72a210dfa82833c4ebe0fd4ede370c9ca/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 28 17:20:35 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176.
Nov 28 17:20:35 compute-0 podman[169675]: 2025-11-28 17:20:35.352964205 +0000 UTC m=+0.340074874 container init 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 28 17:20:35 compute-0 multipathd[169690]: + sudo -E kolla_set_configs
Nov 28 17:20:35 compute-0 podman[169675]: 2025-11-28 17:20:35.397582782 +0000 UTC m=+0.384693441 container start 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 17:20:35 compute-0 sudo[169697]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 28 17:20:35 compute-0 sudo[169697]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 28 17:20:35 compute-0 sudo[169697]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 28 17:20:35 compute-0 podman[169675]: multipathd
Nov 28 17:20:35 compute-0 systemd[1]: Started multipathd container.
Nov 28 17:20:35 compute-0 multipathd[169690]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 28 17:20:35 compute-0 multipathd[169690]: INFO:__main__:Validating config file
Nov 28 17:20:35 compute-0 multipathd[169690]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 28 17:20:35 compute-0 multipathd[169690]: INFO:__main__:Writing out command to execute
Nov 28 17:20:35 compute-0 sudo[169697]: pam_unix(sudo:session): session closed for user root
Nov 28 17:20:35 compute-0 sudo[169631]: pam_unix(sudo:session): session closed for user root
Nov 28 17:20:35 compute-0 multipathd[169690]: ++ cat /run_command
Nov 28 17:20:35 compute-0 multipathd[169690]: + CMD='/usr/sbin/multipathd -d'
Nov 28 17:20:35 compute-0 multipathd[169690]: + ARGS=
Nov 28 17:20:35 compute-0 multipathd[169690]: + sudo kolla_copy_cacerts
Nov 28 17:20:35 compute-0 sudo[169718]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Nov 28 17:20:35 compute-0 sudo[169718]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 28 17:20:35 compute-0 sudo[169718]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 28 17:20:35 compute-0 sudo[169718]: pam_unix(sudo:session): session closed for user root
Nov 28 17:20:35 compute-0 multipathd[169690]: + [[ ! -n '' ]]
Nov 28 17:20:35 compute-0 multipathd[169690]: + . kolla_extend_start
Nov 28 17:20:35 compute-0 multipathd[169690]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Nov 28 17:20:35 compute-0 multipathd[169690]: Running command: '/usr/sbin/multipathd -d'
Nov 28 17:20:35 compute-0 multipathd[169690]: + umask 0022
Nov 28 17:20:35 compute-0 multipathd[169690]: + exec /usr/sbin/multipathd -d
Nov 28 17:20:35 compute-0 podman[169696]: 2025-11-28 17:20:35.486586949 +0000 UTC m=+0.074621786 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 17:20:35 compute-0 systemd[1]: 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176-2a2d943b237d2a0b.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 17:20:35 compute-0 systemd[1]: 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176-2a2d943b237d2a0b.service: Failed with result 'exit-code'.
Nov 28 17:20:35 compute-0 multipathd[169690]: 3610.350342 | --------start up--------
Nov 28 17:20:35 compute-0 multipathd[169690]: 3610.350364 | read /etc/multipath.conf
Nov 28 17:20:35 compute-0 multipathd[169690]: 3610.356989 | path checkers start up
Nov 28 17:20:36 compute-0 python3.9[169880]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 17:20:36 compute-0 sudo[170043]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-burflitpfxokblkafmbxhglcpcydusdy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350436.3112376-1157-278695687884595/AnsiballZ_command.py'
Nov 28 17:20:36 compute-0 sudo[170043]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:20:36 compute-0 podman[170006]: 2025-11-28 17:20:36.722914351 +0000 UTC m=+0.140237550 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 17:20:36 compute-0 python3.9[170049]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 17:20:36 compute-0 sudo[170043]: pam_unix(sudo:session): session closed for user root
Nov 28 17:20:37 compute-0 sudo[170224]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjhxrnflhoghkxgltskzvbarcntribwf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350437.1173284-1173-145412472372963/AnsiballZ_systemd.py'
Nov 28 17:20:37 compute-0 sudo[170224]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:20:37 compute-0 python3.9[170226]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 17:20:37 compute-0 systemd[1]: Stopping multipathd container...
Nov 28 17:20:37 compute-0 multipathd[169690]: 3612.687943 | exit (signal)
Nov 28 17:20:37 compute-0 multipathd[169690]: 3612.688028 | --------shut down-------
Nov 28 17:20:37 compute-0 systemd[1]: libpod-077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176.scope: Deactivated successfully.
Nov 28 17:20:37 compute-0 podman[170230]: 2025-11-28 17:20:37.868060511 +0000 UTC m=+0.088584686 container died 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd)
Nov 28 17:20:37 compute-0 systemd[1]: 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176-2a2d943b237d2a0b.timer: Deactivated successfully.
Nov 28 17:20:37 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176.
Nov 28 17:20:37 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176-userdata-shm.mount: Deactivated successfully.
Nov 28 17:20:37 compute-0 systemd[1]: var-lib-containers-storage-overlay-f86196bcf5bba18c39beda305b9413b72a210dfa82833c4ebe0fd4ede370c9ca-merged.mount: Deactivated successfully.
Nov 28 17:20:37 compute-0 podman[170230]: 2025-11-28 17:20:37.916771898 +0000 UTC m=+0.137296103 container cleanup 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Nov 28 17:20:37 compute-0 podman[170230]: multipathd
Nov 28 17:20:37 compute-0 podman[170261]: multipathd
Nov 28 17:20:37 compute-0 systemd[1]: edpm_multipathd.service: Deactivated successfully.
Nov 28 17:20:37 compute-0 systemd[1]: Stopped multipathd container.
Nov 28 17:20:37 compute-0 systemd[1]: Starting multipathd container...
Nov 28 17:20:38 compute-0 systemd[1]: Started libcrun container.
Nov 28 17:20:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f86196bcf5bba18c39beda305b9413b72a210dfa82833c4ebe0fd4ede370c9ca/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 28 17:20:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f86196bcf5bba18c39beda305b9413b72a210dfa82833c4ebe0fd4ede370c9ca/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 28 17:20:38 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176.
Nov 28 17:20:38 compute-0 podman[170274]: 2025-11-28 17:20:38.087547951 +0000 UTC m=+0.093998065 container init 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 28 17:20:38 compute-0 multipathd[170290]: + sudo -E kolla_set_configs
Nov 28 17:20:38 compute-0 podman[170274]: 2025-11-28 17:20:38.110430652 +0000 UTC m=+0.116880746 container start 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 28 17:20:38 compute-0 sudo[170296]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 28 17:20:38 compute-0 sudo[170296]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 28 17:20:38 compute-0 podman[170274]: multipathd
Nov 28 17:20:38 compute-0 sudo[170296]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 28 17:20:38 compute-0 systemd[1]: Started multipathd container.
Nov 28 17:20:38 compute-0 sudo[170224]: pam_unix(sudo:session): session closed for user root
Nov 28 17:20:38 compute-0 multipathd[170290]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 28 17:20:38 compute-0 multipathd[170290]: INFO:__main__:Validating config file
Nov 28 17:20:38 compute-0 multipathd[170290]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 28 17:20:38 compute-0 multipathd[170290]: INFO:__main__:Writing out command to execute
Nov 28 17:20:38 compute-0 sudo[170296]: pam_unix(sudo:session): session closed for user root
Nov 28 17:20:38 compute-0 multipathd[170290]: ++ cat /run_command
Nov 28 17:20:38 compute-0 multipathd[170290]: + CMD='/usr/sbin/multipathd -d'
Nov 28 17:20:38 compute-0 multipathd[170290]: + ARGS=
Nov 28 17:20:38 compute-0 multipathd[170290]: + sudo kolla_copy_cacerts
Nov 28 17:20:38 compute-0 sudo[170317]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Nov 28 17:20:38 compute-0 sudo[170317]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 28 17:20:38 compute-0 sudo[170317]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 28 17:20:38 compute-0 podman[170297]: 2025-11-28 17:20:38.18171001 +0000 UTC m=+0.058654429 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3)
Nov 28 17:20:38 compute-0 sudo[170317]: pam_unix(sudo:session): session closed for user root
Nov 28 17:20:38 compute-0 multipathd[170290]: + [[ ! -n '' ]]
Nov 28 17:20:38 compute-0 multipathd[170290]: + . kolla_extend_start
Nov 28 17:20:38 compute-0 multipathd[170290]: Running command: '/usr/sbin/multipathd -d'
Nov 28 17:20:38 compute-0 multipathd[170290]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Nov 28 17:20:38 compute-0 multipathd[170290]: + umask 0022
Nov 28 17:20:38 compute-0 multipathd[170290]: + exec /usr/sbin/multipathd -d
Nov 28 17:20:38 compute-0 systemd[1]: 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176-181b025a2efcd426.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 17:20:38 compute-0 systemd[1]: 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176-181b025a2efcd426.service: Failed with result 'exit-code'.
Nov 28 17:20:38 compute-0 multipathd[170290]: 3613.050983 | --------start up--------
Nov 28 17:20:38 compute-0 multipathd[170290]: 3613.051014 | read /etc/multipath.conf
Nov 28 17:20:38 compute-0 multipathd[170290]: 3613.056669 | path checkers start up
Nov 28 17:20:38 compute-0 sudo[170478]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-otdizdytvcgegwfjpxnwqkqkahhdqjwp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350438.3276343-1189-172039246324794/AnsiballZ_file.py'
Nov 28 17:20:38 compute-0 sudo[170478]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:20:38 compute-0 python3.9[170480]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:20:38 compute-0 sudo[170478]: pam_unix(sudo:session): session closed for user root
Nov 28 17:20:39 compute-0 sudo[170630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pffrusyilakenvgtybibrrppuwxlkghj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350439.2986374-1213-7470197449942/AnsiballZ_file.py'
Nov 28 17:20:39 compute-0 sudo[170630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:20:39 compute-0 python3.9[170632]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 28 17:20:39 compute-0 sudo[170630]: pam_unix(sudo:session): session closed for user root
Nov 28 17:20:40 compute-0 sudo[170782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eijwghjxeyapwhymmjzyrtijkbtplyhf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350440.001761-1229-184490150385314/AnsiballZ_modprobe.py'
Nov 28 17:20:40 compute-0 sudo[170782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:20:40 compute-0 python3.9[170784]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Nov 28 17:20:40 compute-0 kernel: Key type psk registered
Nov 28 17:20:40 compute-0 sudo[170782]: pam_unix(sudo:session): session closed for user root
Nov 28 17:20:40 compute-0 sudo[170946]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvygcvwygeujkdiwstfabokobwmqweej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350440.6855378-1245-128669722659216/AnsiballZ_stat.py'
Nov 28 17:20:40 compute-0 sudo[170946]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:20:41 compute-0 python3.9[170948]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:20:41 compute-0 sudo[170946]: pam_unix(sudo:session): session closed for user root
Nov 28 17:20:41 compute-0 sudo[171069]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwurzlppsgycapyaxwbwavnvofxdbule ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350440.6855378-1245-128669722659216/AnsiballZ_copy.py'
Nov 28 17:20:41 compute-0 sudo[171069]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:20:41 compute-0 python3.9[171071]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764350440.6855378-1245-128669722659216/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:20:41 compute-0 sudo[171069]: pam_unix(sudo:session): session closed for user root
Nov 28 17:20:42 compute-0 sudo[171221]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzidviuzggdacuokaylnxnembdpuupgx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350442.0217803-1277-55439442364743/AnsiballZ_lineinfile.py'
Nov 28 17:20:42 compute-0 sudo[171221]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:20:42 compute-0 python3.9[171223]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:20:42 compute-0 sudo[171221]: pam_unix(sudo:session): session closed for user root
Nov 28 17:20:42 compute-0 sudo[171373]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvyuxjdnbnbsixiepjxybnhsehsgumto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350442.6823905-1293-45101340600175/AnsiballZ_systemd.py'
Nov 28 17:20:42 compute-0 sudo[171373]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:20:43 compute-0 python3.9[171375]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 17:20:43 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Nov 28 17:20:43 compute-0 systemd[1]: Stopped Load Kernel Modules.
Nov 28 17:20:43 compute-0 systemd[1]: Stopping Load Kernel Modules...
Nov 28 17:20:43 compute-0 systemd[1]: Starting Load Kernel Modules...
Nov 28 17:20:43 compute-0 systemd[1]: Finished Load Kernel Modules.
Nov 28 17:20:43 compute-0 sudo[171373]: pam_unix(sudo:session): session closed for user root
Nov 28 17:20:43 compute-0 sudo[171529]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bpbnegljnlfezlfcfjgmgvjavvfewtya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350443.6949737-1309-197351201832368/AnsiballZ_dnf.py'
Nov 28 17:20:43 compute-0 sudo[171529]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:20:44 compute-0 python3.9[171531]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 17:20:46 compute-0 systemd[1]: Reloading.
Nov 28 17:20:46 compute-0 systemd-rc-local-generator[171564]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 17:20:46 compute-0 systemd-sysv-generator[171567]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 17:20:46 compute-0 systemd[1]: Reloading.
Nov 28 17:20:46 compute-0 systemd-rc-local-generator[171599]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 17:20:46 compute-0 systemd-sysv-generator[171602]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 17:20:47 compute-0 systemd-logind[788]: Watching system buttons on /dev/input/event0 (Power Button)
Nov 28 17:20:47 compute-0 systemd-logind[788]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Nov 28 17:20:47 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 28 17:20:47 compute-0 systemd[1]: Starting man-db-cache-update.service...
Nov 28 17:20:47 compute-0 systemd[1]: Reloading.
Nov 28 17:20:47 compute-0 systemd-rc-local-generator[171691]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 17:20:47 compute-0 systemd-sysv-generator[171695]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 17:20:47 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 28 17:20:48 compute-0 sudo[171529]: pam_unix(sudo:session): session closed for user root
Nov 28 17:20:48 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 28 17:20:48 compute-0 systemd[1]: Finished man-db-cache-update.service.
Nov 28 17:20:48 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1.589s CPU time.
Nov 28 17:20:48 compute-0 systemd[1]: run-r9fbea481c77744998e2099145a0f5f43.service: Deactivated successfully.
Nov 28 17:20:48 compute-0 sudo[172983]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bibpevumopxdrogbnowgimqdeecqdajr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350448.675289-1325-20028779567623/AnsiballZ_systemd_service.py'
Nov 28 17:20:48 compute-0 sudo[172983]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:20:49 compute-0 python3.9[172985]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 17:20:49 compute-0 systemd[1]: Stopping Open-iSCSI...
Nov 28 17:20:49 compute-0 iscsid[161348]: iscsid shutting down.
Nov 28 17:20:49 compute-0 systemd[1]: iscsid.service: Deactivated successfully.
Nov 28 17:20:49 compute-0 systemd[1]: Stopped Open-iSCSI.
Nov 28 17:20:49 compute-0 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Nov 28 17:20:49 compute-0 systemd[1]: Starting Open-iSCSI...
Nov 28 17:20:49 compute-0 systemd[1]: Started Open-iSCSI.
Nov 28 17:20:49 compute-0 sudo[172983]: pam_unix(sudo:session): session closed for user root
Nov 28 17:20:50 compute-0 python3.9[173140]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 17:20:51 compute-0 sudo[173294]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-knlvqlkzkbnvuuldkuezthtwmcanatge ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350450.7727375-1360-267722336733864/AnsiballZ_file.py'
Nov 28 17:20:51 compute-0 sudo[173294]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:20:51 compute-0 python3.9[173296]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:20:51 compute-0 sudo[173294]: pam_unix(sudo:session): session closed for user root
Nov 28 17:20:52 compute-0 sudo[173446]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-darynfmdyxnaqrkotibeiqskmlcvglkb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350451.735516-1382-251301625725661/AnsiballZ_systemd_service.py'
Nov 28 17:20:52 compute-0 sudo[173446]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:20:52 compute-0 python3.9[173448]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 17:20:52 compute-0 systemd[1]: Reloading.
Nov 28 17:20:52 compute-0 systemd-sysv-generator[173479]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 17:20:52 compute-0 systemd-rc-local-generator[173476]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 17:20:52 compute-0 sudo[173446]: pam_unix(sudo:session): session closed for user root
Nov 28 17:20:53 compute-0 python3.9[173633]: ansible-ansible.builtin.service_facts Invoked
Nov 28 17:20:53 compute-0 network[173650]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 28 17:20:53 compute-0 network[173651]: 'network-scripts' will be removed from distribution in near future.
Nov 28 17:20:53 compute-0 network[173652]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 28 17:20:57 compute-0 sudo[173924]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kpzxposupoesdaychpsignmxoqysenam ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350457.0241625-1420-200316337097430/AnsiballZ_systemd_service.py'
Nov 28 17:20:57 compute-0 sudo[173924]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:20:57 compute-0 python3.9[173926]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 17:20:57 compute-0 sudo[173924]: pam_unix(sudo:session): session closed for user root
Nov 28 17:20:58 compute-0 sudo[174077]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pwxcsgqdvyusziaznkkiffulzfutiexi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350457.9579802-1420-178606938128218/AnsiballZ_systemd_service.py'
Nov 28 17:20:58 compute-0 sudo[174077]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:20:58 compute-0 python3.9[174079]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 17:20:58 compute-0 sudo[174077]: pam_unix(sudo:session): session closed for user root
Nov 28 17:20:58 compute-0 sudo[174230]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pwqscfgeeedfkdveinqoryynmgvzihdf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350458.7151744-1420-180211592697040/AnsiballZ_systemd_service.py'
Nov 28 17:20:58 compute-0 sudo[174230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:20:59 compute-0 python3.9[174232]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 17:20:59 compute-0 sudo[174230]: pam_unix(sudo:session): session closed for user root
Nov 28 17:20:59 compute-0 sudo[174383]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-davtffsvgpqtjgoetgqgpnurbilqtsfm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350459.4691546-1420-16356989883215/AnsiballZ_systemd_service.py'
Nov 28 17:20:59 compute-0 sudo[174383]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:21:00 compute-0 python3.9[174385]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 17:21:00 compute-0 sudo[174383]: pam_unix(sudo:session): session closed for user root
Nov 28 17:21:00 compute-0 sudo[174536]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jgppmniggzosvtpgwkezwyostbzwltlg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350460.1855624-1420-276921930171442/AnsiballZ_systemd_service.py'
Nov 28 17:21:00 compute-0 sudo[174536]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:21:00 compute-0 python3.9[174538]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 17:21:00 compute-0 sudo[174536]: pam_unix(sudo:session): session closed for user root
Nov 28 17:21:01 compute-0 sudo[174689]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yysaxbfvoyhvckpzrsbtyzytoueafdkt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350460.9544258-1420-182396511053704/AnsiballZ_systemd_service.py'
Nov 28 17:21:01 compute-0 sudo[174689]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:21:01 compute-0 python3.9[174691]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 17:21:01 compute-0 sudo[174689]: pam_unix(sudo:session): session closed for user root
Nov 28 17:21:02 compute-0 sudo[174842]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-huqezmfmdpzfsmgshfkcmlbggoibjlsm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350461.8155081-1420-195927378676887/AnsiballZ_systemd_service.py'
Nov 28 17:21:02 compute-0 sudo[174842]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:21:02 compute-0 python3.9[174844]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 17:21:02 compute-0 sudo[174842]: pam_unix(sudo:session): session closed for user root
Nov 28 17:21:03 compute-0 sudo[174995]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygqeyefzomygzormdzewxnthmeocullo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350462.6476538-1420-199434547170252/AnsiballZ_systemd_service.py'
Nov 28 17:21:03 compute-0 sudo[174995]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:21:03 compute-0 podman[174997]: 2025-11-28 17:21:03.120997846 +0000 UTC m=+0.083655463 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 28 17:21:03 compute-0 python3.9[174998]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 17:21:03 compute-0 sudo[174995]: pam_unix(sudo:session): session closed for user root
Nov 28 17:21:04 compute-0 sudo[175168]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfjflkpkaeotcdpgdpprernoypcojfrj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350463.844589-1538-245960562019357/AnsiballZ_file.py'
Nov 28 17:21:04 compute-0 sudo[175168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:21:04 compute-0 python3.9[175170]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:21:04 compute-0 sudo[175168]: pam_unix(sudo:session): session closed for user root
Nov 28 17:21:04 compute-0 sudo[175320]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-golmyoumsniakclaesyvxngyojyeelyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350464.5273952-1538-62780824043687/AnsiballZ_file.py'
Nov 28 17:21:04 compute-0 sudo[175320]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:21:05 compute-0 python3.9[175322]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:21:05 compute-0 sudo[175320]: pam_unix(sudo:session): session closed for user root
Nov 28 17:21:05 compute-0 sudo[175472]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-prxpogwsjxhqqxrenkpholgufztrphxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350465.237395-1538-181446517967136/AnsiballZ_file.py'
Nov 28 17:21:05 compute-0 sudo[175472]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:21:06 compute-0 python3.9[175474]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:21:06 compute-0 sudo[175472]: pam_unix(sudo:session): session closed for user root
Nov 28 17:21:06 compute-0 sudo[175624]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tkhfxxevsklxedjfbarjvgvkyaqutyfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350466.2058358-1538-219230776049403/AnsiballZ_file.py'
Nov 28 17:21:06 compute-0 sudo[175624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:21:06 compute-0 python3.9[175626]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:21:06 compute-0 sudo[175624]: pam_unix(sudo:session): session closed for user root
Nov 28 17:21:07 compute-0 sudo[175789]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvxwwahqqgwmkewqvngmjlrxlhdxgggk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350466.9175477-1538-36200476592212/AnsiballZ_file.py'
Nov 28 17:21:07 compute-0 sudo[175789]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:21:07 compute-0 podman[175749]: 2025-11-28 17:21:07.278070981 +0000 UTC m=+0.125699811 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 28 17:21:07 compute-0 python3.9[175797]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:21:07 compute-0 sudo[175789]: pam_unix(sudo:session): session closed for user root
Nov 28 17:21:07 compute-0 sudo[175954]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ebnojzjjdamkazzktkyqxdzaoonfgkzc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350467.596966-1538-130957810182304/AnsiballZ_file.py'
Nov 28 17:21:07 compute-0 sudo[175954]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:21:08 compute-0 python3.9[175956]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:21:08 compute-0 sudo[175954]: pam_unix(sudo:session): session closed for user root
Nov 28 17:21:08 compute-0 sudo[176122]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bgzoyobyhkzodggijmsxkgocjsmzonrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350468.3024392-1538-32329613234560/AnsiballZ_file.py'
Nov 28 17:21:08 compute-0 sudo[176122]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:21:08 compute-0 podman[176080]: 2025-11-28 17:21:08.709965147 +0000 UTC m=+0.082375156 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125)
Nov 28 17:21:08 compute-0 python3.9[176127]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:21:08 compute-0 sudo[176122]: pam_unix(sudo:session): session closed for user root
Nov 28 17:21:09 compute-0 sudo[176277]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djoceivcabapdinqmeydpmbrrzsosvtb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350469.034067-1538-130632579245839/AnsiballZ_file.py'
Nov 28 17:21:09 compute-0 sudo[176277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:21:09 compute-0 python3.9[176279]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:21:09 compute-0 sudo[176277]: pam_unix(sudo:session): session closed for user root
Nov 28 17:21:10 compute-0 sudo[176429]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqykurbzxerhctqtdwnbnrmrdvwesnyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350469.9047322-1652-124169746753440/AnsiballZ_file.py'
Nov 28 17:21:10 compute-0 sudo[176429]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:21:10 compute-0 python3.9[176431]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:21:10 compute-0 sudo[176429]: pam_unix(sudo:session): session closed for user root
Nov 28 17:21:10 compute-0 sudo[176581]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fplpnsqisvlxckzzhgittxtukhmhgoam ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350470.6195064-1652-118932876907544/AnsiballZ_file.py'
Nov 28 17:21:10 compute-0 sudo[176581]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:21:11 compute-0 python3.9[176583]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:21:11 compute-0 sudo[176581]: pam_unix(sudo:session): session closed for user root
Nov 28 17:21:11 compute-0 sudo[176733]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-heqxhekofwtkuihhlfzvlaoietstnubi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350471.3021836-1652-133493119848520/AnsiballZ_file.py'
Nov 28 17:21:11 compute-0 sudo[176733]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:21:11 compute-0 python3.9[176735]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:21:11 compute-0 sudo[176733]: pam_unix(sudo:session): session closed for user root
Nov 28 17:21:12 compute-0 sudo[176885]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lszfunksobmonypfyckzfdsetvltiihr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350472.003264-1652-22495933682588/AnsiballZ_file.py'
Nov 28 17:21:12 compute-0 sudo[176885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:21:12 compute-0 python3.9[176887]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:21:12 compute-0 sudo[176885]: pam_unix(sudo:session): session closed for user root
Nov 28 17:21:12 compute-0 sudo[177037]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sqxcxjcfnsolfcginirydxaywtzyyvho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350472.6762462-1652-175591633086231/AnsiballZ_file.py'
Nov 28 17:21:12 compute-0 sudo[177037]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:21:13 compute-0 python3.9[177039]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:21:13 compute-0 sudo[177037]: pam_unix(sudo:session): session closed for user root
Nov 28 17:21:14 compute-0 sudo[177189]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zamwsjrsabbhsfjubpnpncyikscsvkzv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350474.0211565-1652-121042032578913/AnsiballZ_file.py'
Nov 28 17:21:14 compute-0 sudo[177189]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:21:14 compute-0 python3.9[177191]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:21:14 compute-0 sudo[177189]: pam_unix(sudo:session): session closed for user root
Nov 28 17:21:15 compute-0 sudo[177341]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aqxubxajdlddoqtqvfylvypcgdhrpqme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350474.7114944-1652-139259320841995/AnsiballZ_file.py'
Nov 28 17:21:15 compute-0 sudo[177341]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:21:15 compute-0 python3.9[177343]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:21:15 compute-0 sudo[177341]: pam_unix(sudo:session): session closed for user root
Nov 28 17:21:16 compute-0 sudo[177493]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fslzkmibjuagrsoldjntzostdmxalhpv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350475.940601-1652-26665264647455/AnsiballZ_file.py'
Nov 28 17:21:16 compute-0 sudo[177493]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:21:16 compute-0 python3.9[177495]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:21:16 compute-0 sudo[177493]: pam_unix(sudo:session): session closed for user root
Nov 28 17:21:17 compute-0 sudo[177645]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bdrrvmcilmvdawqqbtbvnzswgpocnzct ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350476.7709365-1768-63279014187549/AnsiballZ_command.py'
Nov 28 17:21:17 compute-0 sudo[177645]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:21:17 compute-0 python3.9[177647]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 17:21:17 compute-0 sudo[177645]: pam_unix(sudo:session): session closed for user root
Nov 28 17:21:18 compute-0 python3.9[177799]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 28 17:21:18 compute-0 sudo[177949]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwnmxvivkzraekcezvxerargvltrlbkj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350478.6061683-1804-251086164722515/AnsiballZ_systemd_service.py'
Nov 28 17:21:18 compute-0 sudo[177949]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:21:19 compute-0 python3.9[177951]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 17:21:19 compute-0 systemd[1]: Reloading.
Nov 28 17:21:19 compute-0 systemd-rc-local-generator[177977]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 17:21:19 compute-0 systemd-sysv-generator[177983]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 17:21:19 compute-0 sudo[177949]: pam_unix(sudo:session): session closed for user root
Nov 28 17:21:20 compute-0 sudo[178136]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jryljekjaxqkfvrkxxazzwepbhoxftcw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350479.8260963-1820-87382484012861/AnsiballZ_command.py'
Nov 28 17:21:20 compute-0 sudo[178136]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:21:20 compute-0 python3.9[178138]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 17:21:20 compute-0 sudo[178136]: pam_unix(sudo:session): session closed for user root
Nov 28 17:21:20 compute-0 sudo[178289]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ogxqnbcdkrtwwosmfcqehbfnmeufwxjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350480.5402794-1820-280667865401458/AnsiballZ_command.py'
Nov 28 17:21:20 compute-0 sudo[178289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:21:21 compute-0 python3.9[178291]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 17:21:21 compute-0 sudo[178289]: pam_unix(sudo:session): session closed for user root
Nov 28 17:21:21 compute-0 sudo[178442]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cyaqmvwliviwlcdfgzknsdtwkviqlpgr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350481.2191992-1820-123034675032650/AnsiballZ_command.py'
Nov 28 17:21:21 compute-0 sudo[178442]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:21:21 compute-0 python3.9[178444]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 17:21:21 compute-0 sudo[178442]: pam_unix(sudo:session): session closed for user root
Nov 28 17:21:22 compute-0 sudo[178595]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kpsprcncvnljlwuzdbhbjiiizxrvlpnx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350481.884151-1820-265927303165888/AnsiballZ_command.py'
Nov 28 17:21:22 compute-0 sudo[178595]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:21:22 compute-0 python3.9[178597]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 17:21:22 compute-0 sudo[178595]: pam_unix(sudo:session): session closed for user root
Nov 28 17:21:22 compute-0 sudo[178748]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqoxrmqxelzznvtkkdfoasdgwbfgxbnk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350482.5466447-1820-46189498757098/AnsiballZ_command.py'
Nov 28 17:21:22 compute-0 sudo[178748]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:21:23 compute-0 python3.9[178750]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 17:21:23 compute-0 sudo[178748]: pam_unix(sudo:session): session closed for user root
Nov 28 17:21:23 compute-0 sudo[178901]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oljpaepnnwalnntosfumpjjegbpgegjw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350483.2291777-1820-71962651050574/AnsiballZ_command.py'
Nov 28 17:21:23 compute-0 sudo[178901]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:21:23 compute-0 python3.9[178903]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 17:21:23 compute-0 sudo[178901]: pam_unix(sudo:session): session closed for user root
Nov 28 17:21:24 compute-0 sudo[179054]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdzljmkgiqpntqkhmcyzpdczpkedpelu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350483.9313636-1820-271196230978773/AnsiballZ_command.py'
Nov 28 17:21:24 compute-0 sudo[179054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:21:24 compute-0 python3.9[179056]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 17:21:24 compute-0 sudo[179054]: pam_unix(sudo:session): session closed for user root
Nov 28 17:21:24 compute-0 sudo[179207]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kfqvwudahsdkpdjnjxiabkgtpitfbjjw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350484.621086-1820-105897677021511/AnsiballZ_command.py'
Nov 28 17:21:24 compute-0 sudo[179207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:21:25 compute-0 python3.9[179209]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 17:21:25 compute-0 sudo[179207]: pam_unix(sudo:session): session closed for user root
Nov 28 17:21:26 compute-0 sudo[179360]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hkdfqmapjjissgimwnxdbwfjtuuubscu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350486.2205184-1963-96448788753836/AnsiballZ_file.py'
Nov 28 17:21:26 compute-0 sudo[179360]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:21:26 compute-0 python3.9[179362]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:21:26 compute-0 sudo[179360]: pam_unix(sudo:session): session closed for user root
Nov 28 17:21:27 compute-0 sudo[179512]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cskpjcxgguwqsbrwnteyqphnuxsrubor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350486.9911976-1963-280077420583597/AnsiballZ_file.py'
Nov 28 17:21:27 compute-0 sudo[179512]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:21:27 compute-0 python3.9[179514]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:21:27 compute-0 sudo[179512]: pam_unix(sudo:session): session closed for user root
Nov 28 17:21:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:21:27.663 104433 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:21:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:21:27.664 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:21:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:21:27.664 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:21:27 compute-0 sudo[179664]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqipvowlzcsknazzyrmojjykcszgfvos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350487.6931543-1963-265802019583729/AnsiballZ_file.py'
Nov 28 17:21:27 compute-0 sudo[179664]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:21:28 compute-0 python3.9[179666]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:21:28 compute-0 sudo[179664]: pam_unix(sudo:session): session closed for user root
Nov 28 17:21:28 compute-0 sudo[179816]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lajqhzoqawmjrixemvqqubfcrkruuntp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350488.420864-2007-177185227954495/AnsiballZ_file.py'
Nov 28 17:21:28 compute-0 sudo[179816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:21:28 compute-0 python3.9[179818]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:21:29 compute-0 sudo[179816]: pam_unix(sudo:session): session closed for user root
Nov 28 17:21:29 compute-0 sudo[179968]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-flerqudfkvxyyvgibjmrapqwdddakryq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350489.1889687-2007-234462629023444/AnsiballZ_file.py'
Nov 28 17:21:29 compute-0 sudo[179968]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:21:29 compute-0 python3.9[179970]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:21:29 compute-0 sudo[179968]: pam_unix(sudo:session): session closed for user root
Nov 28 17:21:30 compute-0 sudo[180120]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bygfolvseijjoznqcyhddqmjvxinhqfr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350490.0218394-2007-172606366344115/AnsiballZ_file.py'
Nov 28 17:21:30 compute-0 sudo[180120]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:21:30 compute-0 python3.9[180122]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:21:30 compute-0 sudo[180120]: pam_unix(sudo:session): session closed for user root
Nov 28 17:21:30 compute-0 sudo[180272]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etyeypaozwgxvhuliylwhdvnjzbtymgi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350490.6979206-2007-127397579028215/AnsiballZ_file.py'
Nov 28 17:21:30 compute-0 sudo[180272]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:21:31 compute-0 python3.9[180274]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:21:31 compute-0 sudo[180272]: pam_unix(sudo:session): session closed for user root
Nov 28 17:21:31 compute-0 sudo[180424]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upvoraxktpljsiyfugrwwesbyohadhbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350491.361471-2007-195883715952097/AnsiballZ_file.py'
Nov 28 17:21:31 compute-0 sudo[180424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:21:31 compute-0 python3.9[180426]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:21:31 compute-0 sudo[180424]: pam_unix(sudo:session): session closed for user root
Nov 28 17:21:32 compute-0 sudo[180576]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wpejiyuorowytwhemocbhufwyxouclpb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350492.1059053-2007-89450686698001/AnsiballZ_file.py'
Nov 28 17:21:32 compute-0 sudo[180576]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:21:32 compute-0 python3.9[180578]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:21:32 compute-0 sudo[180576]: pam_unix(sudo:session): session closed for user root
Nov 28 17:21:33 compute-0 sudo[180728]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xygjumnrlztdeakjynhpfmsgkcblyttd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350492.8072882-2007-82906474403601/AnsiballZ_file.py'
Nov 28 17:21:33 compute-0 sudo[180728]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:21:33 compute-0 python3.9[180730]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:21:33 compute-0 sudo[180728]: pam_unix(sudo:session): session closed for user root
Nov 28 17:21:33 compute-0 podman[180731]: 2025-11-28 17:21:33.49502461 +0000 UTC m=+0.123895319 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 17:21:38 compute-0 sudo[180921]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pdpiqovstbnxtuyfsfcuqewwkxqayssw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350497.7487395-2244-38835689140009/AnsiballZ_getent.py'
Nov 28 17:21:38 compute-0 sudo[180921]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:21:38 compute-0 podman[180852]: 2025-11-28 17:21:38.331919924 +0000 UTC m=+0.178895245 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 28 17:21:38 compute-0 python3.9[180926]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Nov 28 17:21:38 compute-0 sudo[180921]: pam_unix(sudo:session): session closed for user root
Nov 28 17:21:39 compute-0 sudo[181091]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrnqfsqcsbbupiulccousgxxvusyrxxz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350498.72988-2260-115346479004288/AnsiballZ_group.py'
Nov 28 17:21:39 compute-0 sudo[181091]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:21:39 compute-0 podman[181051]: 2025-11-28 17:21:39.193599332 +0000 UTC m=+0.065342618 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 28 17:21:39 compute-0 python3.9[181098]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 28 17:21:39 compute-0 groupadd[181099]: group added to /etc/group: name=nova, GID=42436
Nov 28 17:21:39 compute-0 groupadd[181099]: group added to /etc/gshadow: name=nova
Nov 28 17:21:39 compute-0 groupadd[181099]: new group: name=nova, GID=42436
Nov 28 17:21:39 compute-0 sudo[181091]: pam_unix(sudo:session): session closed for user root
Nov 28 17:21:40 compute-0 sudo[181254]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frfdfbrzieuxctmqtlgrdaltpylehiht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350499.7127752-2276-81249448195808/AnsiballZ_user.py'
Nov 28 17:21:40 compute-0 sudo[181254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:21:40 compute-0 python3.9[181256]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 28 17:21:40 compute-0 useradd[181258]: new user: name=nova, UID=42436, GID=42436, home=/home/nova, shell=/bin/sh, from=/dev/pts/0
Nov 28 17:21:40 compute-0 useradd[181258]: add 'nova' to group 'libvirt'
Nov 28 17:21:40 compute-0 useradd[181258]: add 'nova' to shadow group 'libvirt'
Nov 28 17:21:40 compute-0 sudo[181254]: pam_unix(sudo:session): session closed for user root
Nov 28 17:21:41 compute-0 sshd-session[181289]: Accepted publickey for zuul from 192.168.122.30 port 47436 ssh2: ECDSA SHA256:WZF57Px3euv6N78lWIGFzFXKuQ6jzdjGOx/DgOfNYD8
Nov 28 17:21:41 compute-0 systemd-logind[788]: New session 25 of user zuul.
Nov 28 17:21:41 compute-0 systemd[1]: Started Session 25 of User zuul.
Nov 28 17:21:41 compute-0 sshd-session[181289]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 28 17:21:42 compute-0 sshd-session[181292]: Received disconnect from 192.168.122.30 port 47436:11: disconnected by user
Nov 28 17:21:42 compute-0 sshd-session[181292]: Disconnected from user zuul 192.168.122.30 port 47436
Nov 28 17:21:42 compute-0 sshd-session[181289]: pam_unix(sshd:session): session closed for user zuul
Nov 28 17:21:42 compute-0 systemd[1]: session-25.scope: Deactivated successfully.
Nov 28 17:21:42 compute-0 systemd-logind[788]: Session 25 logged out. Waiting for processes to exit.
Nov 28 17:21:42 compute-0 systemd-logind[788]: Removed session 25.
Nov 28 17:21:42 compute-0 python3.9[181442]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:21:43 compute-0 python3.9[181563]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764350502.2382936-2326-121668322810385/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:21:44 compute-0 python3.9[181713]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:21:44 compute-0 python3.9[181789]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:21:45 compute-0 python3.9[181939]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:21:45 compute-0 python3.9[182060]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764350504.8079185-2326-174330760981504/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:21:46 compute-0 python3.9[182210]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:21:47 compute-0 python3.9[182331]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764350506.0150323-2326-190094337756371/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=1feba546d0beacad9258164ab79b8a747685ccc8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:21:47 compute-0 python3.9[182481]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:21:48 compute-0 python3.9[182602]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764350507.3246586-2326-26806192136766/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:21:49 compute-0 python3.9[182752]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:21:49 compute-0 python3.9[182873]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764350508.5118823-2326-261924266486533/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:21:50 compute-0 sudo[183023]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfborefryxwyehncpnklkmtqcbwcxuzg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350509.87658-2492-183073021527763/AnsiballZ_file.py'
Nov 28 17:21:50 compute-0 sudo[183023]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:21:50 compute-0 python3.9[183025]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:21:50 compute-0 sudo[183023]: pam_unix(sudo:session): session closed for user root
Nov 28 17:21:50 compute-0 sudo[183175]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pekgliftnoddydapauwevlznztlmyimi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350510.6472569-2508-136036178160243/AnsiballZ_copy.py'
Nov 28 17:21:50 compute-0 sudo[183175]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:21:51 compute-0 python3.9[183177]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:21:51 compute-0 sudo[183175]: pam_unix(sudo:session): session closed for user root
Nov 28 17:21:51 compute-0 sudo[183327]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-btyusvdttzgvuwrbfabvtlrsinxtbwgo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350511.3763764-2524-230612537728343/AnsiballZ_stat.py'
Nov 28 17:21:51 compute-0 sudo[183327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:21:51 compute-0 python3.9[183329]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 17:21:51 compute-0 sudo[183327]: pam_unix(sudo:session): session closed for user root
Nov 28 17:21:52 compute-0 sudo[183479]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-skzsajghbqhidtwotjebqtvxepcdnqga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350512.1736064-2540-38313518455343/AnsiballZ_stat.py'
Nov 28 17:21:52 compute-0 sudo[183479]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:21:52 compute-0 python3.9[183481]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:21:52 compute-0 sudo[183479]: pam_unix(sudo:session): session closed for user root
Nov 28 17:21:53 compute-0 sudo[183602]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-algyyjjhwjvstzyatzlxdvaqquobjpwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350512.1736064-2540-38313518455343/AnsiballZ_copy.py'
Nov 28 17:21:53 compute-0 sudo[183602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:21:53 compute-0 python3.9[183604]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1764350512.1736064-2540-38313518455343/.source _original_basename=.u4343uvx follow=False checksum=e14ab2545c2ded86edcd48f6b187f5f74bf948dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Nov 28 17:21:53 compute-0 sudo[183602]: pam_unix(sudo:session): session closed for user root
Nov 28 17:21:54 compute-0 python3.9[183756]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 17:21:55 compute-0 python3.9[183908]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:21:55 compute-0 python3.9[184029]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764350514.500234-2592-223904438235949/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=211ffd0bca4b407eb4de45a749ef70116a7806fd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:21:56 compute-0 python3.9[184179]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:21:57 compute-0 python3.9[184300]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764350515.9581435-2622-162210704706465/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:21:57 compute-0 sudo[184450]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmpohyztrdyezcdhvsewdzloangqocxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350517.4762573-2656-42757836938432/AnsiballZ_container_config_data.py'
Nov 28 17:21:57 compute-0 sudo[184450]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:21:57 compute-0 python3.9[184452]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Nov 28 17:21:58 compute-0 sudo[184450]: pam_unix(sudo:session): session closed for user root
Nov 28 17:21:58 compute-0 sudo[184602]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nyqbztnzwchivptqnjrooglmtumjlkum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350518.3174915-2674-55182322987200/AnsiballZ_container_config_hash.py'
Nov 28 17:21:58 compute-0 sudo[184602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:21:58 compute-0 python3.9[184604]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 28 17:21:58 compute-0 sudo[184602]: pam_unix(sudo:session): session closed for user root
Nov 28 17:21:59 compute-0 sudo[184754]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iycefmwbzdvhswypmddrhelndsbjnkbm ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764350519.299074-2694-92082833942758/AnsiballZ_edpm_container_manage.py'
Nov 28 17:21:59 compute-0 sudo[184754]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:21:59 compute-0 python3[184756]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Nov 28 17:22:00 compute-0 podman[184794]: 2025-11-28 17:22:00.163900449 +0000 UTC m=+0.073863830 container create 5c6330e4d35e750c3bcb52706d209d716479c83ee3aada6b147eaffa9ed9bc78 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_id=edpm, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=nova_compute_init, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.vendor=CentOS)
Nov 28 17:22:00 compute-0 podman[184794]: 2025-11-28 17:22:00.120081424 +0000 UTC m=+0.030044905 image pull b65793e7266422f5b94c32d109b906c8ffd974cf2ddf0b6929e463e29e05864a quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Nov 28 17:22:00 compute-0 python3[184756]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Nov 28 17:22:00 compute-0 sudo[184754]: pam_unix(sudo:session): session closed for user root
Nov 28 17:22:00 compute-0 sudo[184983]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbyodevvzkmeclasdrgtasglpkjsxgne ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350520.5275395-2710-202196399264790/AnsiballZ_stat.py'
Nov 28 17:22:00 compute-0 sudo[184983]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:22:00 compute-0 python3.9[184985]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 17:22:01 compute-0 sudo[184983]: pam_unix(sudo:session): session closed for user root
Nov 28 17:22:01 compute-0 sudo[185137]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xdizuqomlkacafwjrddfrzixwmnifhok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350521.5955114-2734-269834258138931/AnsiballZ_container_config_data.py'
Nov 28 17:22:01 compute-0 sudo[185137]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:22:02 compute-0 python3.9[185139]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Nov 28 17:22:02 compute-0 sudo[185137]: pam_unix(sudo:session): session closed for user root
Nov 28 17:22:02 compute-0 sudo[185289]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhqgecvzmyggnjoddvibpgkpzclclywj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350522.583318-2752-95772603979091/AnsiballZ_container_config_hash.py'
Nov 28 17:22:02 compute-0 sudo[185289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:22:03 compute-0 python3.9[185291]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 28 17:22:03 compute-0 sudo[185289]: pam_unix(sudo:session): session closed for user root
Nov 28 17:22:03 compute-0 sudo[185454]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zlcqdncpqgkhvycxcmnpiddlmmigdxpd ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764350523.4605758-2772-151776126739591/AnsiballZ_edpm_container_manage.py'
Nov 28 17:22:03 compute-0 sudo[185454]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:22:03 compute-0 podman[185415]: 2025-11-28 17:22:03.774261752 +0000 UTC m=+0.072501328 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent)
Nov 28 17:22:04 compute-0 python3[185462]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Nov 28 17:22:04 compute-0 podman[185499]: 2025-11-28 17:22:04.20930156 +0000 UTC m=+0.022126455 image pull b65793e7266422f5b94c32d109b906c8ffd974cf2ddf0b6929e463e29e05864a quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Nov 28 17:22:04 compute-0 podman[185499]: 2025-11-28 17:22:04.366327121 +0000 UTC m=+0.179151976 container create 62f503b283c9d73ebceb73c7835b48325ee3e928b86c844114797933982290b0 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_managed=true, io.buildah.version=1.41.3, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, container_name=nova_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.vendor=CentOS)
Nov 28 17:22:04 compute-0 python3[185462]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Nov 28 17:22:04 compute-0 sudo[185454]: pam_unix(sudo:session): session closed for user root
Nov 28 17:22:05 compute-0 sudo[185687]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdnorqfyovjsyxlqxuwrialsvpppijri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350524.7041855-2788-252477213326766/AnsiballZ_stat.py'
Nov 28 17:22:05 compute-0 sudo[185687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:22:05 compute-0 python3.9[185689]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 17:22:05 compute-0 sudo[185687]: pam_unix(sudo:session): session closed for user root
Nov 28 17:22:05 compute-0 sudo[185841]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzxttpyfjtqgamvijafxklihantnjknu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350525.595555-2806-60941570769782/AnsiballZ_file.py'
Nov 28 17:22:05 compute-0 sudo[185841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:22:06 compute-0 python3.9[185843]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:22:06 compute-0 sudo[185841]: pam_unix(sudo:session): session closed for user root
Nov 28 17:22:06 compute-0 sudo[185992]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwculnacglyrzuqaipuewugvzhmitsfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350526.1796527-2806-137648695077014/AnsiballZ_copy.py'
Nov 28 17:22:06 compute-0 sudo[185992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:22:06 compute-0 python3.9[185994]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764350526.1796527-2806-137648695077014/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:22:06 compute-0 sudo[185992]: pam_unix(sudo:session): session closed for user root
Nov 28 17:22:07 compute-0 sudo[186068]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kgxposgatfacdcyrdhgvdpsjmapkcokr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350526.1796527-2806-137648695077014/AnsiballZ_systemd.py'
Nov 28 17:22:07 compute-0 sudo[186068]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:22:07 compute-0 python3.9[186070]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 17:22:07 compute-0 systemd[1]: Reloading.
Nov 28 17:22:07 compute-0 systemd-rc-local-generator[186096]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 17:22:07 compute-0 systemd-sysv-generator[186100]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 17:22:07 compute-0 sudo[186068]: pam_unix(sudo:session): session closed for user root
Nov 28 17:22:08 compute-0 sudo[186179]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ityxpmpocickjjbcuhauqnqkkenvrmnu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350526.1796527-2806-137648695077014/AnsiballZ_systemd.py'
Nov 28 17:22:08 compute-0 sudo[186179]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:22:08 compute-0 python3.9[186181]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 17:22:08 compute-0 systemd[1]: Reloading.
Nov 28 17:22:08 compute-0 podman[186183]: 2025-11-28 17:22:08.554006994 +0000 UTC m=+0.121893161 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 17:22:08 compute-0 systemd-rc-local-generator[186235]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 17:22:08 compute-0 systemd-sysv-generator[186240]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 17:22:08 compute-0 systemd[1]: Starting nova_compute container...
Nov 28 17:22:08 compute-0 systemd[1]: Started libcrun container.
Nov 28 17:22:08 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/45490cca88ccb6ff8bc5d9293727af1f3843372d27196562a0d6aa064ffbd607/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Nov 28 17:22:08 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/45490cca88ccb6ff8bc5d9293727af1f3843372d27196562a0d6aa064ffbd607/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 28 17:22:08 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/45490cca88ccb6ff8bc5d9293727af1f3843372d27196562a0d6aa064ffbd607/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 17:22:08 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/45490cca88ccb6ff8bc5d9293727af1f3843372d27196562a0d6aa064ffbd607/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 28 17:22:08 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/45490cca88ccb6ff8bc5d9293727af1f3843372d27196562a0d6aa064ffbd607/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 28 17:22:09 compute-0 podman[186247]: 2025-11-28 17:22:09.22547218 +0000 UTC m=+0.397323649 container init 62f503b283c9d73ebceb73c7835b48325ee3e928b86c844114797933982290b0 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=nova_compute, io.buildah.version=1.41.3)
Nov 28 17:22:09 compute-0 podman[186247]: 2025-11-28 17:22:09.237641451 +0000 UTC m=+0.409492860 container start 62f503b283c9d73ebceb73c7835b48325ee3e928b86c844114797933982290b0 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=nova_compute, managed_by=edpm_ansible, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 17:22:09 compute-0 nova_compute[186262]: + sudo -E kolla_set_configs
Nov 28 17:22:09 compute-0 nova_compute[186262]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 28 17:22:09 compute-0 nova_compute[186262]: INFO:__main__:Validating config file
Nov 28 17:22:09 compute-0 nova_compute[186262]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 28 17:22:09 compute-0 nova_compute[186262]: INFO:__main__:Copying service configuration files
Nov 28 17:22:09 compute-0 nova_compute[186262]: INFO:__main__:Deleting /etc/nova/nova.conf
Nov 28 17:22:09 compute-0 nova_compute[186262]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Nov 28 17:22:09 compute-0 nova_compute[186262]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Nov 28 17:22:09 compute-0 nova_compute[186262]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Nov 28 17:22:09 compute-0 nova_compute[186262]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Nov 28 17:22:09 compute-0 nova_compute[186262]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 28 17:22:09 compute-0 nova_compute[186262]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 28 17:22:09 compute-0 nova_compute[186262]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Nov 28 17:22:09 compute-0 nova_compute[186262]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Nov 28 17:22:09 compute-0 nova_compute[186262]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 28 17:22:09 compute-0 nova_compute[186262]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 28 17:22:09 compute-0 nova_compute[186262]: INFO:__main__:Deleting /etc/ceph
Nov 28 17:22:09 compute-0 nova_compute[186262]: INFO:__main__:Creating directory /etc/ceph
Nov 28 17:22:09 compute-0 nova_compute[186262]: INFO:__main__:Setting permission for /etc/ceph
Nov 28 17:22:09 compute-0 nova_compute[186262]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Nov 28 17:22:09 compute-0 nova_compute[186262]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 28 17:22:09 compute-0 nova_compute[186262]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Nov 28 17:22:09 compute-0 nova_compute[186262]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 28 17:22:09 compute-0 nova_compute[186262]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Nov 28 17:22:09 compute-0 nova_compute[186262]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Nov 28 17:22:09 compute-0 nova_compute[186262]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Nov 28 17:22:09 compute-0 nova_compute[186262]: INFO:__main__:Writing out command to execute
Nov 28 17:22:09 compute-0 nova_compute[186262]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Nov 28 17:22:09 compute-0 nova_compute[186262]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 28 17:22:09 compute-0 nova_compute[186262]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 28 17:22:09 compute-0 nova_compute[186262]: ++ cat /run_command
Nov 28 17:22:09 compute-0 podman[186247]: nova_compute
Nov 28 17:22:09 compute-0 nova_compute[186262]: + CMD=nova-compute
Nov 28 17:22:09 compute-0 nova_compute[186262]: + ARGS=
Nov 28 17:22:09 compute-0 nova_compute[186262]: + sudo kolla_copy_cacerts
Nov 28 17:22:09 compute-0 systemd[1]: Started nova_compute container.
Nov 28 17:22:09 compute-0 nova_compute[186262]: Running command: 'nova-compute'
Nov 28 17:22:09 compute-0 nova_compute[186262]: + [[ ! -n '' ]]
Nov 28 17:22:09 compute-0 nova_compute[186262]: + . kolla_extend_start
Nov 28 17:22:09 compute-0 nova_compute[186262]: + echo 'Running command: '\''nova-compute'\'''
Nov 28 17:22:09 compute-0 nova_compute[186262]: + umask 0022
Nov 28 17:22:09 compute-0 nova_compute[186262]: + exec nova-compute
Nov 28 17:22:09 compute-0 sudo[186179]: pam_unix(sudo:session): session closed for user root
Nov 28 17:22:09 compute-0 podman[186272]: 2025-11-28 17:22:09.516665787 +0000 UTC m=+0.065259858 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Nov 28 17:22:10 compute-0 python3.9[186443]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 17:22:11 compute-0 python3.9[186593]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 17:22:11 compute-0 nova_compute[186262]: 2025-11-28 17:22:11.599 186266 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 28 17:22:11 compute-0 nova_compute[186262]: 2025-11-28 17:22:11.600 186266 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 28 17:22:11 compute-0 nova_compute[186262]: 2025-11-28 17:22:11.600 186266 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 28 17:22:11 compute-0 nova_compute[186262]: 2025-11-28 17:22:11.600 186266 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Nov 28 17:22:11 compute-0 nova_compute[186262]: 2025-11-28 17:22:11.765 186266 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:22:11 compute-0 nova_compute[186262]: 2025-11-28 17:22:11.783 186266 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.017s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:22:11 compute-0 nova_compute[186262]: 2025-11-28 17:22:11.783 186266 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Nov 28 17:22:12 compute-0 python3.9[186747]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.419 186266 INFO nova.virt.driver [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.540 186266 INFO nova.compute.provider_config [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.556 186266 DEBUG oslo_concurrency.lockutils [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.556 186266 DEBUG oslo_concurrency.lockutils [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.557 186266 DEBUG oslo_concurrency.lockutils [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.557 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.557 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.557 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.557 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.557 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.558 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.558 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.558 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.558 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.558 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.558 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.558 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.559 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.559 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.559 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.559 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.559 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.559 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.559 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.560 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.560 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.560 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.560 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.560 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.560 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.560 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.561 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.561 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.561 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.561 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.561 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.561 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.562 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.562 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.562 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.562 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.562 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.562 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.563 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.563 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.563 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.563 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.563 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.563 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.563 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.564 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.564 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.564 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.564 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.564 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.564 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.565 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.565 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.565 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.565 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.565 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.565 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.565 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.565 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.566 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.566 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.566 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.566 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.566 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.566 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.566 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.567 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.567 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.567 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.567 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.567 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.567 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.567 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.568 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.568 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.568 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.568 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.568 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.568 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.568 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.569 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.569 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.569 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.569 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.569 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.569 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.569 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.570 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.570 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.570 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.570 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.570 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.570 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.570 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.571 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.571 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.571 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.571 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.571 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.571 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.571 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.572 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.572 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.572 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.572 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.572 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.572 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.572 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.573 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.573 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.573 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.573 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.573 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.573 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.573 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.574 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.574 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.574 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.574 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.574 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.574 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.574 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.575 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.575 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.575 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.575 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.575 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.575 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.575 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.576 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.576 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.576 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.576 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.576 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.576 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.576 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.577 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.577 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.577 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.577 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.577 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.577 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.577 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.578 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.578 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.578 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.578 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.578 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.578 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.578 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.579 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.579 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.579 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.579 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.579 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.579 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.579 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.580 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.580 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.580 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.580 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.580 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.581 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.581 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.581 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.581 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.581 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.582 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.582 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.582 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.582 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.582 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.582 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.583 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.583 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.583 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.583 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.583 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.583 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.584 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.584 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.584 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.584 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.584 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.584 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.584 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.585 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.585 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.585 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.585 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.585 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.585 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.585 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.586 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.586 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.586 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.586 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.586 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.587 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.587 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.587 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.587 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.587 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.587 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.587 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.588 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.588 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.588 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.588 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.588 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.588 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.588 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.589 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.589 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.589 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.589 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.589 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.589 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.589 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.590 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.590 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.590 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.590 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.590 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.590 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.590 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.591 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.591 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.591 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.591 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.591 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.591 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.591 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.592 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.592 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.592 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.592 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.592 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.592 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.593 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.593 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.593 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.593 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.593 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.593 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.593 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.594 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.594 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.594 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.594 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.594 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.594 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.595 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.595 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.595 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.595 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.595 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.595 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.595 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.596 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.596 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.596 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.596 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.596 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.596 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.597 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.597 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.597 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.597 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.597 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.597 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.597 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.598 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.598 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.598 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.598 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.598 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.598 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.599 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.599 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.599 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.599 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.599 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.599 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.599 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.600 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.600 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.600 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.600 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.600 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.600 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.600 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.601 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.601 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.601 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.601 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.601 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.601 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.601 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.602 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.602 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.602 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.602 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.602 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.602 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.602 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.603 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.603 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.603 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.603 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.603 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.603 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.603 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.604 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.604 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.605 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.605 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.605 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.606 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.606 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.606 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.606 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.606 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.606 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.607 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.607 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.607 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.607 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.607 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.608 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.608 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.608 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.608 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.608 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.608 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.609 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.609 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.609 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.609 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.609 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.609 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.609 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.610 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.610 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.610 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.610 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.610 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.610 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.611 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.611 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.611 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.611 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.611 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.611 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.611 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.612 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.612 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.612 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.612 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.613 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.613 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.613 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.613 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.613 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.613 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.613 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.614 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.614 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.614 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.614 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.614 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.614 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.615 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.615 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.615 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.615 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.615 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.615 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.615 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.616 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.616 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.616 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.616 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.616 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.616 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.616 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.617 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.617 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.617 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.617 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.617 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.618 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.618 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.618 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.618 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.618 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.618 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.618 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.619 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.619 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.619 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.619 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.619 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.619 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.620 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.620 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.620 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.620 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.620 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.620 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.620 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.621 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.621 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.621 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.621 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.621 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.621 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.622 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.622 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.622 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.622 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.622 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.622 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.622 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.623 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.623 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.623 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.623 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.623 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.623 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.623 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.624 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.624 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.624 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.624 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.624 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.624 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.625 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.625 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.625 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.625 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.625 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.625 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.626 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.626 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.626 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.626 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.626 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.626 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.627 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.627 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.627 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.627 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.627 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.627 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.628 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.628 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.628 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.628 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.628 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.628 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.628 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.628 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.629 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.629 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.629 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.629 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.629 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.629 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.629 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.630 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.630 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.630 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.630 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.630 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.630 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.630 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.631 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.631 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.631 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.631 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.631 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.631 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.631 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.631 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.632 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.632 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.632 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.632 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.632 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.632 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.633 186266 WARNING oslo_config.cfg [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Nov 28 17:22:12 compute-0 nova_compute[186262]: live_migration_uri is deprecated for removal in favor of two other options that
Nov 28 17:22:12 compute-0 nova_compute[186262]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Nov 28 17:22:12 compute-0 nova_compute[186262]: and ``live_migration_inbound_addr`` respectively.
Nov 28 17:22:12 compute-0 nova_compute[186262]: ).  Its value may be silently ignored in the future.
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.633 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.633 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.633 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.633 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.633 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.634 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.634 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.634 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.634 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.634 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.634 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.634 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.635 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.635 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.635 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.635 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.635 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.635 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.635 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.636 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.636 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.636 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.636 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.636 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.636 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.636 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.637 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.637 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.637 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.637 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.637 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.637 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.637 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.638 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.638 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.638 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.638 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.638 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.638 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.638 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.639 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.639 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.639 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.639 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.639 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.639 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.639 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.640 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.640 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.640 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.640 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.640 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.640 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.640 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.641 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.641 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.641 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.641 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.641 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.641 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.641 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.641 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.642 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.642 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.642 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.642 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.642 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.642 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.642 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.643 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.643 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.643 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.643 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.643 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.643 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.643 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.643 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.644 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.644 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.644 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.644 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.644 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.644 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.644 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.645 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.645 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.645 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.645 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.645 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.645 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.645 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.646 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.646 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.646 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.646 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.646 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.646 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.646 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.647 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.647 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.647 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.647 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.647 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.647 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.647 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.647 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.648 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.648 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.648 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.648 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.648 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.648 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.648 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.649 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.649 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.649 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.649 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.649 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.649 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.650 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.650 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.650 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.650 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.650 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.650 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.650 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.651 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.651 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.651 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.651 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.651 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.651 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.651 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.652 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.652 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.652 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.652 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.652 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.652 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.653 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.653 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.653 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.653 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.653 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.654 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.654 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.654 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.654 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.654 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.654 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.655 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.655 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.655 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.655 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.655 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.655 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.655 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.656 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.656 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.656 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.656 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.656 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.657 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.657 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.657 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.657 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.657 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.657 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.657 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.658 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.658 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.658 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.658 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.658 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.659 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.659 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.659 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.659 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.659 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.660 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.660 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.660 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.660 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.660 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.660 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.661 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.661 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.661 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.661 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.661 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.661 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.661 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.662 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.662 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.662 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.662 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.662 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.662 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.663 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.663 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.663 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.663 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.663 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.663 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.664 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.664 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.664 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.664 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.664 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.664 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.665 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.665 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.665 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.665 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.665 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.665 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.666 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.666 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.666 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.666 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.666 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.666 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.666 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.667 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.667 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.667 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.667 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.667 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.667 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.667 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.668 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.668 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.668 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.668 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.668 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.668 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.668 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.669 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.669 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.669 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.669 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.669 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.669 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.669 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.670 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.670 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.670 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.670 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.670 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.670 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.671 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.671 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.671 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.671 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.671 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.671 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.672 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.672 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.672 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.672 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.672 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.672 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.672 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.673 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.673 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.673 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.673 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.673 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.673 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.673 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.674 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.674 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.674 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.674 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.674 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.674 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.674 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.675 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.675 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.675 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.675 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.675 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.675 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.676 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.676 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.676 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.676 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.676 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.676 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.676 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.677 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.677 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.677 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.677 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.677 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.677 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.678 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.678 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.678 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.678 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.678 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.678 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.678 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.679 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.679 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.679 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.679 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.679 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.680 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.680 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.680 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.680 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.680 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.680 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.680 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.681 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.681 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.681 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.681 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.681 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.681 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.682 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.682 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.682 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.682 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.683 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.683 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.683 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.683 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.683 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.684 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.684 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.684 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.684 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.684 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.685 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.685 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.685 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.685 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.685 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.686 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.686 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.686 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.686 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.686 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.686 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.687 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.687 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.687 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.687 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.687 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.688 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.688 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.688 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.688 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.688 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.688 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.689 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.689 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.689 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.689 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.689 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.689 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.690 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.690 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.690 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.690 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.690 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.690 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.691 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.691 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.691 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.691 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.691 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.691 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.692 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.692 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.692 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.692 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.692 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.693 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.693 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.693 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.693 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.693 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.693 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.694 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.694 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.694 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.694 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.694 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.695 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.695 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.695 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.695 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.695 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.695 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.696 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.696 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.696 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.696 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.696 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.696 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.697 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.697 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.697 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.697 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.697 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.698 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.698 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.698 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.698 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.698 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.699 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.699 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.699 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.699 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.699 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.699 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.700 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.700 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.700 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.700 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.700 186266 DEBUG oslo_service.service [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.702 186266 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.714 186266 DEBUG nova.virt.libvirt.host [None req-23e1a686-b401-4ea3-9e9c-34be63e91e12 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.715 186266 DEBUG nova.virt.libvirt.host [None req-23e1a686-b401-4ea3-9e9c-34be63e91e12 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.715 186266 DEBUG nova.virt.libvirt.host [None req-23e1a686-b401-4ea3-9e9c-34be63e91e12 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.716 186266 DEBUG nova.virt.libvirt.host [None req-23e1a686-b401-4ea3-9e9c-34be63e91e12 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Nov 28 17:22:12 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Nov 28 17:22:12 compute-0 systemd[1]: Started libvirt QEMU daemon.
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.809 186266 DEBUG nova.virt.libvirt.host [None req-23e1a686-b401-4ea3-9e9c-34be63e91e12 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7ff31c89dd00> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.813 186266 DEBUG nova.virt.libvirt.host [None req-23e1a686-b401-4ea3-9e9c-34be63e91e12 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7ff31c89dd00> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.814 186266 INFO nova.virt.libvirt.driver [None req-23e1a686-b401-4ea3-9e9c-34be63e91e12 - - - - - -] Connection event '1' reason 'None'
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.829 186266 WARNING nova.virt.libvirt.driver [None req-23e1a686-b401-4ea3-9e9c-34be63e91e12 - - - - - -] Cannot update service status on host "compute-0.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Nov 28 17:22:12 compute-0 nova_compute[186262]: 2025-11-28 17:22:12.830 186266 DEBUG nova.virt.libvirt.volume.mount [None req-23e1a686-b401-4ea3-9e9c-34be63e91e12 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Nov 28 17:22:12 compute-0 sudo[186949]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umgonspjlibdormbrmrxsdxgezguzaqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350532.4945042-2926-243826998983260/AnsiballZ_podman_container.py'
Nov 28 17:22:12 compute-0 sudo[186949]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:22:13 compute-0 python3.9[186951]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None 
preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Nov 28 17:22:13 compute-0 sudo[186949]: pam_unix(sudo:session): session closed for user root
Nov 28 17:22:13 compute-0 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 17:22:13 compute-0 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 17:22:13 compute-0 nova_compute[186262]: 2025-11-28 17:22:13.660 186266 INFO nova.virt.libvirt.host [None req-23e1a686-b401-4ea3-9e9c-34be63e91e12 - - - - - -] Libvirt host capabilities <capabilities>
Nov 28 17:22:13 compute-0 nova_compute[186262]: 
Nov 28 17:22:13 compute-0 nova_compute[186262]:   <host>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <uuid>ea074f67-9ae1-4549-ac73-672b8df6afe1</uuid>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <cpu>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <arch>x86_64</arch>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model>EPYC-Rome-v4</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <vendor>AMD</vendor>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <microcode version='16777317'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <signature family='23' model='49' stepping='0'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <maxphysaddr mode='emulate' bits='40'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature name='x2apic'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature name='tsc-deadline'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature name='osxsave'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature name='hypervisor'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature name='tsc_adjust'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature name='spec-ctrl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature name='stibp'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature name='arch-capabilities'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature name='ssbd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature name='cmp_legacy'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature name='topoext'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature name='virt-ssbd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature name='lbrv'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature name='tsc-scale'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature name='vmcb-clean'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature name='pause-filter'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature name='pfthreshold'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature name='svme-addr-chk'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature name='rdctl-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature name='skip-l1dfl-vmentry'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature name='mds-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature name='pschange-mc-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <pages unit='KiB' size='4'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <pages unit='KiB' size='2048'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <pages unit='KiB' size='1048576'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </cpu>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <power_management>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <suspend_mem/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <suspend_disk/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <suspend_hybrid/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </power_management>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <iommu support='no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <migration_features>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <live/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <uri_transports>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <uri_transport>tcp</uri_transport>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <uri_transport>rdma</uri_transport>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </uri_transports>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </migration_features>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <topology>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <cells num='1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <cell id='0'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:           <memory unit='KiB'>7864324</memory>
Nov 28 17:22:13 compute-0 nova_compute[186262]:           <pages unit='KiB' size='4'>1966081</pages>
Nov 28 17:22:13 compute-0 nova_compute[186262]:           <pages unit='KiB' size='2048'>0</pages>
Nov 28 17:22:13 compute-0 nova_compute[186262]:           <pages unit='KiB' size='1048576'>0</pages>
Nov 28 17:22:13 compute-0 nova_compute[186262]:           <distances>
Nov 28 17:22:13 compute-0 nova_compute[186262]:             <sibling id='0' value='10'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:           </distances>
Nov 28 17:22:13 compute-0 nova_compute[186262]:           <cpus num='8'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:           </cpus>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         </cell>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </cells>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </topology>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <cache>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </cache>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <secmodel>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model>selinux</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <doi>0</doi>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </secmodel>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <secmodel>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model>dac</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <doi>0</doi>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <baselabel type='kvm'>+107:+107</baselabel>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <baselabel type='qemu'>+107:+107</baselabel>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </secmodel>
Nov 28 17:22:13 compute-0 nova_compute[186262]:   </host>
Nov 28 17:22:13 compute-0 nova_compute[186262]: 
Nov 28 17:22:13 compute-0 nova_compute[186262]:   <guest>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <os_type>hvm</os_type>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <arch name='i686'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <wordsize>32</wordsize>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <domain type='qemu'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <domain type='kvm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </arch>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <features>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <pae/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <nonpae/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <acpi default='on' toggle='yes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <apic default='on' toggle='no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <cpuselection/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <deviceboot/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <disksnapshot default='on' toggle='no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <externalSnapshot/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </features>
Nov 28 17:22:13 compute-0 nova_compute[186262]:   </guest>
Nov 28 17:22:13 compute-0 nova_compute[186262]: 
Nov 28 17:22:13 compute-0 nova_compute[186262]:   <guest>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <os_type>hvm</os_type>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <arch name='x86_64'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <wordsize>64</wordsize>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <domain type='qemu'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <domain type='kvm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </arch>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <features>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <acpi default='on' toggle='yes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <apic default='on' toggle='no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <cpuselection/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <deviceboot/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <disksnapshot default='on' toggle='no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <externalSnapshot/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </features>
Nov 28 17:22:13 compute-0 nova_compute[186262]:   </guest>
Nov 28 17:22:13 compute-0 nova_compute[186262]: 
Nov 28 17:22:13 compute-0 nova_compute[186262]: </capabilities>
Nov 28 17:22:13 compute-0 nova_compute[186262]: 
Nov 28 17:22:13 compute-0 nova_compute[186262]: 2025-11-28 17:22:13.667 186266 DEBUG nova.virt.libvirt.host [None req-23e1a686-b401-4ea3-9e9c-34be63e91e12 - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 28 17:22:13 compute-0 nova_compute[186262]: 2025-11-28 17:22:13.694 186266 DEBUG nova.virt.libvirt.host [None req-23e1a686-b401-4ea3-9e9c-34be63e91e12 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Nov 28 17:22:13 compute-0 nova_compute[186262]: <domainCapabilities>
Nov 28 17:22:13 compute-0 nova_compute[186262]:   <path>/usr/libexec/qemu-kvm</path>
Nov 28 17:22:13 compute-0 nova_compute[186262]:   <domain>kvm</domain>
Nov 28 17:22:13 compute-0 nova_compute[186262]:   <machine>pc-i440fx-rhel7.6.0</machine>
Nov 28 17:22:13 compute-0 nova_compute[186262]:   <arch>i686</arch>
Nov 28 17:22:13 compute-0 nova_compute[186262]:   <vcpu max='240'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:   <iothreads supported='yes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:   <os supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <enum name='firmware'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <loader supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='type'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>rom</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>pflash</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='readonly'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>yes</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>no</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='secure'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>no</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </loader>
Nov 28 17:22:13 compute-0 nova_compute[186262]:   </os>
Nov 28 17:22:13 compute-0 nova_compute[186262]:   <cpu>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <mode name='host-passthrough' supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='hostPassthroughMigratable'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>on</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>off</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </mode>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <mode name='maximum' supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='maximumMigratable'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>on</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>off</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </mode>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <mode name='host-model' supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <vendor>AMD</vendor>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='x2apic'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='tsc-deadline'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='hypervisor'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='tsc_adjust'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='spec-ctrl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='stibp'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='ssbd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='cmp_legacy'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='overflow-recov'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='succor'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='ibrs'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='amd-ssbd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='virt-ssbd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='lbrv'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='tsc-scale'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='vmcb-clean'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='flushbyasid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='pause-filter'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='pfthreshold'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='svme-addr-chk'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='disable' name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </mode>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <mode name='custom' supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Broadwell'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Broadwell-IBRS'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Broadwell-noTSX'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Broadwell-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Broadwell-v2'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Broadwell-v3'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Broadwell-v4'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Cascadelake-Server'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Cascadelake-Server-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Cascadelake-Server-v2'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Cascadelake-Server-v3'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Cascadelake-Server-v4'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Cascadelake-Server-v5'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Cooperlake'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='taa-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Cooperlake-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='taa-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Cooperlake-v2'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='taa-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Denverton'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='mpx'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Denverton-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='mpx'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Denverton-v2'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Denverton-v3'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Dhyana-v2'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='EPYC-Genoa'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amd-psfd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='auto-ibrs'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512ifma'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='no-nested-data-bp'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='null-sel-clr-base'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='stibp-always-on'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='EPYC-Genoa-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amd-psfd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='auto-ibrs'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512ifma'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='no-nested-data-bp'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='null-sel-clr-base'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='stibp-always-on'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='EPYC-Milan'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='EPYC-Milan-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='EPYC-Milan-v2'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amd-psfd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='no-nested-data-bp'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='null-sel-clr-base'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='stibp-always-on'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='EPYC-Rome'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='EPYC-Rome-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='EPYC-Rome-v2'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='EPYC-Rome-v3'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='EPYC-v3'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='EPYC-v4'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='GraniteRapids'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-fp16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-int8'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-tile'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx-vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-fp16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512ifma'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fbsdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrc'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrs'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fzrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='mcdt-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pbrsb-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='prefetchiti'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='psdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='sbdr-ssdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='serialize'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='taa-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='tsx-ldtrk'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xfd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='GraniteRapids-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-fp16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-int8'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-tile'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx-vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-fp16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512ifma'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fbsdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrc'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrs'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fzrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='mcdt-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pbrsb-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='prefetchiti'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='psdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='sbdr-ssdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='serialize'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='taa-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='tsx-ldtrk'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xfd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='GraniteRapids-v2'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-fp16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-int8'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-tile'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx-vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx10'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx10-128'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx10-256'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx10-512'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-fp16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512ifma'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='cldemote'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fbsdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrc'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrs'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fzrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='mcdt-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='movdir64b'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='movdiri'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pbrsb-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='prefetchiti'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='psdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='sbdr-ssdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='serialize'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ss'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='taa-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='tsx-ldtrk'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xfd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Haswell'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Haswell-IBRS'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Haswell-noTSX'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Haswell-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Haswell-v2'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Haswell-v3'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Haswell-v4'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Icelake-Server'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Icelake-Server-noTSX'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Icelake-Server-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Icelake-Server-v2'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Icelake-Server-v3'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='taa-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Icelake-Server-v4'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512ifma'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='taa-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Icelake-Server-v5'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512ifma'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='taa-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Icelake-Server-v6'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512ifma'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='taa-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Icelake-Server-v7'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512ifma'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='taa-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='IvyBridge'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='IvyBridge-IBRS'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='IvyBridge-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='IvyBridge-v2'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='KnightsMill'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-4fmaps'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-4vnniw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512er'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512pf'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ss'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='KnightsMill-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-4fmaps'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-4vnniw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512er'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512pf'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ss'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Opteron_G4'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fma4'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xop'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Opteron_G4-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fma4'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xop'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Opteron_G5'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fma4'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='tbm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xop'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Opteron_G5-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fma4'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='tbm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xop'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='SapphireRapids'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-int8'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-tile'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx-vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-fp16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512ifma'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrc'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrs'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fzrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='serialize'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='taa-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='tsx-ldtrk'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xfd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='SapphireRapids-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-int8'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-tile'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx-vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-fp16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512ifma'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrc'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrs'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fzrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='serialize'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='taa-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='tsx-ldtrk'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xfd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='SapphireRapids-v2'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-int8'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-tile'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx-vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-fp16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512ifma'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fbsdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrc'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrs'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fzrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='psdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='sbdr-ssdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='serialize'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='taa-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='tsx-ldtrk'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xfd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='SapphireRapids-v3'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-int8'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-tile'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx-vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-fp16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512ifma'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='cldemote'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fbsdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrc'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrs'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fzrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='movdir64b'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='movdiri'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='psdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='sbdr-ssdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='serialize'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ss'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='taa-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='tsx-ldtrk'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xfd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='SierraForest'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx-ifma'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx-ne-convert'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx-vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx-vnni-int8'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='cmpccxadd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fbsdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrs'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='mcdt-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pbrsb-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='psdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='sbdr-ssdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='serialize'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='SierraForest-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx-ifma'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx-ne-convert'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx-vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx-vnni-int8'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='cmpccxadd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fbsdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrs'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='mcdt-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pbrsb-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='psdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='sbdr-ssdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='serialize'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Skylake-Client'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Skylake-Client-IBRS'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Skylake-Client-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Skylake-Client-v2'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Skylake-Client-v3'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Skylake-Client-v4'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Skylake-Server'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Skylake-Server-IBRS'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Skylake-Server-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Skylake-Server-v2'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Skylake-Server-v3'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Skylake-Server-v4'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Skylake-Server-v5'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Snowridge'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='cldemote'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='core-capability'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='movdir64b'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='movdiri'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='mpx'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='split-lock-detect'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Snowridge-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='cldemote'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='core-capability'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='movdir64b'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='movdiri'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='mpx'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='split-lock-detect'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Snowridge-v2'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='cldemote'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='core-capability'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='movdir64b'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='movdiri'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='split-lock-detect'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Snowridge-v3'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='cldemote'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='core-capability'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='movdir64b'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='movdiri'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='split-lock-detect'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Snowridge-v4'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='cldemote'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='movdir64b'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='movdiri'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='athlon'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='3dnow'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='3dnowext'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='athlon-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='3dnow'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='3dnowext'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='core2duo'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ss'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='core2duo-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ss'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='coreduo'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ss'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='coreduo-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ss'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='n270'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ss'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='n270-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ss'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='phenom'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='3dnow'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='3dnowext'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='phenom-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='3dnow'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='3dnowext'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </mode>
Nov 28 17:22:13 compute-0 nova_compute[186262]:   </cpu>
Nov 28 17:22:13 compute-0 nova_compute[186262]:   <memoryBacking supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <enum name='sourceType'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <value>file</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <value>anonymous</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <value>memfd</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:   </memoryBacking>
Nov 28 17:22:13 compute-0 nova_compute[186262]:   <devices>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <disk supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='diskDevice'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>disk</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>cdrom</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>floppy</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>lun</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='bus'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>ide</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>fdc</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>scsi</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>virtio</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>usb</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>sata</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='model'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>virtio</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>virtio-transitional</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>virtio-non-transitional</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </disk>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <graphics supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='type'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>vnc</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>egl-headless</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>dbus</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </graphics>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <video supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='modelType'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>vga</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>cirrus</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>virtio</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>none</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>bochs</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>ramfb</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </video>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <hostdev supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='mode'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>subsystem</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='startupPolicy'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>default</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>mandatory</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>requisite</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>optional</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='subsysType'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>usb</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>pci</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>scsi</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='capsType'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='pciBackend'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </hostdev>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <rng supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='model'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>virtio</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>virtio-transitional</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>virtio-non-transitional</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='backendModel'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>random</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>egd</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>builtin</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </rng>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <filesystem supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='driverType'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>path</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>handle</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>virtiofs</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </filesystem>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <tpm supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='model'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>tpm-tis</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>tpm-crb</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='backendModel'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>emulator</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>external</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='backendVersion'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>2.0</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </tpm>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <redirdev supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='bus'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>usb</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </redirdev>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <channel supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='type'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>pty</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>unix</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </channel>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <crypto supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='model'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='type'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>qemu</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='backendModel'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>builtin</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </crypto>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <interface supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='backendType'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>default</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>passt</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </interface>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <panic supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='model'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>isa</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>hyperv</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </panic>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <console supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='type'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>null</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>vc</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>pty</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>dev</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>file</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>pipe</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>stdio</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>udp</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>tcp</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>unix</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>qemu-vdagent</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>dbus</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </console>
Nov 28 17:22:13 compute-0 nova_compute[186262]:   </devices>
Nov 28 17:22:13 compute-0 nova_compute[186262]:   <features>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <gic supported='no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <vmcoreinfo supported='yes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <genid supported='yes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <backingStoreInput supported='yes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <backup supported='yes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <async-teardown supported='yes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <ps2 supported='yes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <sev supported='no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <sgx supported='no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <hyperv supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='features'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>relaxed</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>vapic</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>spinlocks</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>vpindex</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>runtime</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>synic</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>stimer</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>reset</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>vendor_id</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>frequencies</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>reenlightenment</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>tlbflush</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>ipi</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>avic</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>emsr_bitmap</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>xmm_input</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <defaults>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <spinlocks>4095</spinlocks>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <stimer_direct>on</stimer_direct>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <tlbflush_direct>on</tlbflush_direct>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <tlbflush_extended>on</tlbflush_extended>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </defaults>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </hyperv>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <launchSecurity supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='sectype'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>tdx</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </launchSecurity>
Nov 28 17:22:13 compute-0 nova_compute[186262]:   </features>
Nov 28 17:22:13 compute-0 nova_compute[186262]: </domainCapabilities>
Nov 28 17:22:13 compute-0 nova_compute[186262]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 28 17:22:13 compute-0 nova_compute[186262]: 2025-11-28 17:22:13.700 186266 DEBUG nova.virt.libvirt.host [None req-23e1a686-b401-4ea3-9e9c-34be63e91e12 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Nov 28 17:22:13 compute-0 nova_compute[186262]: <domainCapabilities>
Nov 28 17:22:13 compute-0 nova_compute[186262]:   <path>/usr/libexec/qemu-kvm</path>
Nov 28 17:22:13 compute-0 nova_compute[186262]:   <domain>kvm</domain>
Nov 28 17:22:13 compute-0 nova_compute[186262]:   <machine>pc-q35-rhel9.8.0</machine>
Nov 28 17:22:13 compute-0 nova_compute[186262]:   <arch>i686</arch>
Nov 28 17:22:13 compute-0 nova_compute[186262]:   <vcpu max='4096'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:   <iothreads supported='yes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:   <os supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <enum name='firmware'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <loader supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='type'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>rom</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>pflash</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='readonly'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>yes</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>no</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='secure'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>no</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </loader>
Nov 28 17:22:13 compute-0 nova_compute[186262]:   </os>
Nov 28 17:22:13 compute-0 nova_compute[186262]:   <cpu>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <mode name='host-passthrough' supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='hostPassthroughMigratable'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>on</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>off</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </mode>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <mode name='maximum' supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='maximumMigratable'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>on</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>off</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </mode>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <mode name='host-model' supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <vendor>AMD</vendor>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='x2apic'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='tsc-deadline'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='hypervisor'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='tsc_adjust'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='spec-ctrl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='stibp'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='ssbd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='cmp_legacy'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='overflow-recov'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='succor'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='ibrs'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='amd-ssbd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='virt-ssbd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='lbrv'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='tsc-scale'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='vmcb-clean'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='flushbyasid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='pause-filter'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='pfthreshold'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='svme-addr-chk'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='disable' name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </mode>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <mode name='custom' supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Broadwell'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Broadwell-IBRS'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Broadwell-noTSX'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Broadwell-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Broadwell-v2'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Broadwell-v3'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Broadwell-v4'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Cascadelake-Server'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Cascadelake-Server-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Cascadelake-Server-v2'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Cascadelake-Server-v3'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Cascadelake-Server-v4'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Cascadelake-Server-v5'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Cooperlake'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='taa-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Cooperlake-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='taa-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Cooperlake-v2'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='taa-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Denverton'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='mpx'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Denverton-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='mpx'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Denverton-v2'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Denverton-v3'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Dhyana-v2'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='EPYC-Genoa'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amd-psfd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='auto-ibrs'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512ifma'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='no-nested-data-bp'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='null-sel-clr-base'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='stibp-always-on'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='EPYC-Genoa-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amd-psfd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='auto-ibrs'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512ifma'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='no-nested-data-bp'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='null-sel-clr-base'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='stibp-always-on'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='EPYC-Milan'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='EPYC-Milan-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='EPYC-Milan-v2'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amd-psfd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='no-nested-data-bp'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='null-sel-clr-base'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='stibp-always-on'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='EPYC-Rome'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='EPYC-Rome-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='EPYC-Rome-v2'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='EPYC-Rome-v3'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='EPYC-v3'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='EPYC-v4'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='GraniteRapids'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-fp16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-int8'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-tile'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx-vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-fp16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512ifma'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fbsdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrc'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrs'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fzrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='mcdt-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pbrsb-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='prefetchiti'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='psdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='sbdr-ssdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='serialize'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='taa-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='tsx-ldtrk'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xfd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='GraniteRapids-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-fp16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-int8'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-tile'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx-vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-fp16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512ifma'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fbsdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrc'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrs'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fzrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='mcdt-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pbrsb-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='prefetchiti'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='psdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='sbdr-ssdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='serialize'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='taa-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='tsx-ldtrk'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xfd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='GraniteRapids-v2'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-fp16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-int8'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-tile'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx-vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx10'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx10-128'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx10-256'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx10-512'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-fp16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512ifma'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='cldemote'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fbsdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrc'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrs'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fzrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='mcdt-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='movdir64b'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='movdiri'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pbrsb-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='prefetchiti'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='psdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='sbdr-ssdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='serialize'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ss'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='taa-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='tsx-ldtrk'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xfd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Haswell'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Haswell-IBRS'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Haswell-noTSX'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Haswell-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Haswell-v2'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Haswell-v3'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Haswell-v4'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Icelake-Server'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Icelake-Server-noTSX'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Icelake-Server-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Icelake-Server-v2'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Icelake-Server-v3'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='taa-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Icelake-Server-v4'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512ifma'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='taa-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Icelake-Server-v5'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512ifma'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='taa-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Icelake-Server-v6'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512ifma'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='taa-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Icelake-Server-v7'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512ifma'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='taa-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='IvyBridge'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='IvyBridge-IBRS'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='IvyBridge-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='IvyBridge-v2'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='KnightsMill'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-4fmaps'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-4vnniw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512er'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512pf'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ss'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='KnightsMill-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-4fmaps'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-4vnniw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512er'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512pf'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ss'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Opteron_G4'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fma4'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xop'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Opteron_G4-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fma4'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xop'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Opteron_G5'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fma4'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='tbm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xop'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Opteron_G5-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fma4'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='tbm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xop'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='SapphireRapids'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-int8'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-tile'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx-vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-fp16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512ifma'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrc'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrs'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fzrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='serialize'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='taa-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='tsx-ldtrk'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xfd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='SapphireRapids-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-int8'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-tile'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx-vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-fp16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512ifma'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrc'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrs'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fzrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='serialize'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='taa-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='tsx-ldtrk'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xfd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='SapphireRapids-v2'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-int8'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-tile'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx-vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-fp16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512ifma'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fbsdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrc'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrs'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fzrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='psdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='sbdr-ssdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='serialize'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='taa-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='tsx-ldtrk'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xfd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='SapphireRapids-v3'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-int8'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-tile'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx-vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-fp16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512ifma'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='cldemote'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fbsdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrc'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrs'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fzrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='movdir64b'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='movdiri'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='psdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='sbdr-ssdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='serialize'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ss'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='taa-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='tsx-ldtrk'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xfd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='SierraForest'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx-ifma'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx-ne-convert'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx-vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx-vnni-int8'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='cmpccxadd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fbsdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrs'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='mcdt-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pbrsb-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='psdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='sbdr-ssdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='serialize'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='SierraForest-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx-ifma'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx-ne-convert'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx-vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx-vnni-int8'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='cmpccxadd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fbsdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrs'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='mcdt-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pbrsb-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='psdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='sbdr-ssdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='serialize'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Skylake-Client'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Skylake-Client-IBRS'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Skylake-Client-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Skylake-Client-v2'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Skylake-Client-v3'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Skylake-Client-v4'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Skylake-Server'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Skylake-Server-IBRS'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Skylake-Server-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Skylake-Server-v2'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Skylake-Server-v3'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Skylake-Server-v4'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Skylake-Server-v5'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Snowridge'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='cldemote'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='core-capability'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='movdir64b'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='movdiri'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='mpx'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='split-lock-detect'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Snowridge-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='cldemote'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='core-capability'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='movdir64b'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='movdiri'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='mpx'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='split-lock-detect'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Snowridge-v2'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='cldemote'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='core-capability'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='movdir64b'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='movdiri'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='split-lock-detect'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Snowridge-v3'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='cldemote'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='core-capability'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='movdir64b'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='movdiri'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='split-lock-detect'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Snowridge-v4'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='cldemote'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='movdir64b'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='movdiri'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='athlon'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='3dnow'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='3dnowext'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='athlon-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='3dnow'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='3dnowext'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='core2duo'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ss'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='core2duo-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ss'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='coreduo'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ss'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='coreduo-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ss'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='n270'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ss'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='n270-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ss'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='phenom'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='3dnow'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='3dnowext'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='phenom-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='3dnow'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='3dnowext'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </mode>
Nov 28 17:22:13 compute-0 nova_compute[186262]:   </cpu>
Nov 28 17:22:13 compute-0 nova_compute[186262]:   <memoryBacking supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <enum name='sourceType'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <value>file</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <value>anonymous</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <value>memfd</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:   </memoryBacking>
Nov 28 17:22:13 compute-0 nova_compute[186262]:   <devices>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <disk supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='diskDevice'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>disk</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>cdrom</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>floppy</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>lun</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='bus'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>fdc</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>scsi</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>virtio</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>usb</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>sata</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='model'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>virtio</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>virtio-transitional</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>virtio-non-transitional</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </disk>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <graphics supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='type'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>vnc</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>egl-headless</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>dbus</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </graphics>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <video supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='modelType'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>vga</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>cirrus</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>virtio</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>none</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>bochs</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>ramfb</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </video>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <hostdev supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='mode'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>subsystem</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='startupPolicy'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>default</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>mandatory</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>requisite</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>optional</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='subsysType'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>usb</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>pci</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>scsi</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='capsType'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='pciBackend'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </hostdev>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <rng supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='model'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>virtio</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>virtio-transitional</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>virtio-non-transitional</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='backendModel'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>random</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>egd</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>builtin</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </rng>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <filesystem supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='driverType'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>path</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>handle</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>virtiofs</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </filesystem>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <tpm supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='model'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>tpm-tis</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>tpm-crb</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='backendModel'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>emulator</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>external</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='backendVersion'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>2.0</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </tpm>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <redirdev supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='bus'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>usb</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </redirdev>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <channel supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='type'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>pty</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>unix</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </channel>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <crypto supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='model'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='type'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>qemu</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='backendModel'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>builtin</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </crypto>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <interface supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='backendType'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>default</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>passt</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </interface>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <panic supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='model'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>isa</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>hyperv</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </panic>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <console supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='type'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>null</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>vc</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>pty</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>dev</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>file</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>pipe</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>stdio</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>udp</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>tcp</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>unix</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>qemu-vdagent</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>dbus</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </console>
Nov 28 17:22:13 compute-0 nova_compute[186262]:   </devices>
Nov 28 17:22:13 compute-0 nova_compute[186262]:   <features>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <gic supported='no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <vmcoreinfo supported='yes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <genid supported='yes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <backingStoreInput supported='yes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <backup supported='yes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <async-teardown supported='yes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <ps2 supported='yes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <sev supported='no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <sgx supported='no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <hyperv supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='features'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>relaxed</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>vapic</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>spinlocks</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>vpindex</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>runtime</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>synic</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>stimer</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>reset</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>vendor_id</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>frequencies</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>reenlightenment</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>tlbflush</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>ipi</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>avic</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>emsr_bitmap</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>xmm_input</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <defaults>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <spinlocks>4095</spinlocks>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <stimer_direct>on</stimer_direct>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <tlbflush_direct>on</tlbflush_direct>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <tlbflush_extended>on</tlbflush_extended>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </defaults>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </hyperv>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <launchSecurity supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='sectype'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>tdx</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </launchSecurity>
Nov 28 17:22:13 compute-0 nova_compute[186262]:   </features>
Nov 28 17:22:13 compute-0 nova_compute[186262]: </domainCapabilities>
Nov 28 17:22:13 compute-0 nova_compute[186262]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 28 17:22:13 compute-0 nova_compute[186262]: 2025-11-28 17:22:13.730 186266 DEBUG nova.virt.libvirt.host [None req-23e1a686-b401-4ea3-9e9c-34be63e91e12 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 28 17:22:13 compute-0 nova_compute[186262]: 2025-11-28 17:22:13.735 186266 DEBUG nova.virt.libvirt.host [None req-23e1a686-b401-4ea3-9e9c-34be63e91e12 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Nov 28 17:22:13 compute-0 nova_compute[186262]: <domainCapabilities>
Nov 28 17:22:13 compute-0 nova_compute[186262]:   <path>/usr/libexec/qemu-kvm</path>
Nov 28 17:22:13 compute-0 nova_compute[186262]:   <domain>kvm</domain>
Nov 28 17:22:13 compute-0 nova_compute[186262]:   <machine>pc-i440fx-rhel7.6.0</machine>
Nov 28 17:22:13 compute-0 nova_compute[186262]:   <arch>x86_64</arch>
Nov 28 17:22:13 compute-0 nova_compute[186262]:   <vcpu max='240'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:   <iothreads supported='yes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:   <os supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <enum name='firmware'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <loader supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='type'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>rom</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>pflash</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='readonly'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>yes</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>no</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='secure'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>no</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </loader>
Nov 28 17:22:13 compute-0 nova_compute[186262]:   </os>
Nov 28 17:22:13 compute-0 nova_compute[186262]:   <cpu>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <mode name='host-passthrough' supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='hostPassthroughMigratable'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>on</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>off</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </mode>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <mode name='maximum' supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='maximumMigratable'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>on</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>off</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </mode>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <mode name='host-model' supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <vendor>AMD</vendor>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='x2apic'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='tsc-deadline'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='hypervisor'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='tsc_adjust'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='spec-ctrl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='stibp'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='ssbd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='cmp_legacy'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='overflow-recov'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='succor'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='ibrs'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='amd-ssbd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='virt-ssbd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='lbrv'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='tsc-scale'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='vmcb-clean'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='flushbyasid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='pause-filter'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='pfthreshold'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='svme-addr-chk'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='disable' name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </mode>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <mode name='custom' supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Broadwell'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Broadwell-IBRS'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Broadwell-noTSX'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Broadwell-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Broadwell-v2'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Broadwell-v3'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Broadwell-v4'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Cascadelake-Server'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Cascadelake-Server-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Cascadelake-Server-v2'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Cascadelake-Server-v3'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Cascadelake-Server-v4'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Cascadelake-Server-v5'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Cooperlake'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='taa-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Cooperlake-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='taa-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Cooperlake-v2'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='taa-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Denverton'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='mpx'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Denverton-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='mpx'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Denverton-v2'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Denverton-v3'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Dhyana-v2'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='EPYC-Genoa'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amd-psfd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='auto-ibrs'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512ifma'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='no-nested-data-bp'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='null-sel-clr-base'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='stibp-always-on'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='EPYC-Genoa-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amd-psfd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='auto-ibrs'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512ifma'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='no-nested-data-bp'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='null-sel-clr-base'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='stibp-always-on'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='EPYC-Milan'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='EPYC-Milan-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='EPYC-Milan-v2'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amd-psfd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='no-nested-data-bp'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='null-sel-clr-base'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='stibp-always-on'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='EPYC-Rome'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='EPYC-Rome-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='EPYC-Rome-v2'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='EPYC-Rome-v3'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='EPYC-v3'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='EPYC-v4'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='GraniteRapids'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-fp16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-int8'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-tile'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx-vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-fp16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512ifma'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fbsdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrc'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrs'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fzrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='mcdt-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pbrsb-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='prefetchiti'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='psdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='sbdr-ssdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='serialize'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='taa-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='tsx-ldtrk'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xfd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='GraniteRapids-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-fp16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-int8'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-tile'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx-vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-fp16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512ifma'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fbsdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrc'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrs'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fzrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='mcdt-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pbrsb-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='prefetchiti'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='psdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='sbdr-ssdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='serialize'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='taa-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='tsx-ldtrk'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xfd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='GraniteRapids-v2'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-fp16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-int8'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-tile'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx-vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx10'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx10-128'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx10-256'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx10-512'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-fp16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512ifma'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='cldemote'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fbsdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrc'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrs'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fzrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='mcdt-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='movdir64b'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='movdiri'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pbrsb-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='prefetchiti'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='psdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='sbdr-ssdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='serialize'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ss'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='taa-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='tsx-ldtrk'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xfd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Haswell'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Haswell-IBRS'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Haswell-noTSX'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Haswell-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Haswell-v2'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Haswell-v3'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Haswell-v4'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Icelake-Server'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Icelake-Server-noTSX'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Icelake-Server-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Icelake-Server-v2'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Icelake-Server-v3'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='taa-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Icelake-Server-v4'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512ifma'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='taa-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Icelake-Server-v5'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512ifma'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='taa-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Icelake-Server-v6'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512ifma'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='taa-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Icelake-Server-v7'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512ifma'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='taa-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='IvyBridge'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='IvyBridge-IBRS'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='IvyBridge-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='IvyBridge-v2'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='KnightsMill'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-4fmaps'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-4vnniw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512er'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512pf'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ss'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='KnightsMill-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-4fmaps'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-4vnniw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512er'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512pf'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ss'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Opteron_G4'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fma4'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xop'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Opteron_G4-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fma4'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xop'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Opteron_G5'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fma4'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='tbm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xop'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Opteron_G5-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fma4'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='tbm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xop'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='SapphireRapids'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-int8'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-tile'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx-vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-fp16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512ifma'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrc'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrs'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fzrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='serialize'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='taa-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='tsx-ldtrk'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xfd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='SapphireRapids-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-int8'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-tile'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx-vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-fp16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512ifma'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrc'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrs'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fzrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='serialize'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='taa-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='tsx-ldtrk'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xfd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='SapphireRapids-v2'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-int8'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-tile'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx-vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-fp16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512ifma'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fbsdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrc'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrs'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fzrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='psdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='sbdr-ssdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='serialize'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='taa-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='tsx-ldtrk'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xfd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='SapphireRapids-v3'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-int8'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-tile'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx-vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-fp16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512ifma'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='cldemote'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fbsdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrc'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrs'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fzrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='movdir64b'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='movdiri'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='psdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='sbdr-ssdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='serialize'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ss'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='taa-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='tsx-ldtrk'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xfd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='SierraForest'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx-ifma'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx-ne-convert'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx-vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx-vnni-int8'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='cmpccxadd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fbsdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrs'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='mcdt-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pbrsb-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='psdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='sbdr-ssdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='serialize'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='SierraForest-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx-ifma'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx-ne-convert'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx-vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx-vnni-int8'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='cmpccxadd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fbsdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrs'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='mcdt-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pbrsb-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='psdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='sbdr-ssdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='serialize'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Skylake-Client'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Skylake-Client-IBRS'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Skylake-Client-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Skylake-Client-v2'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Skylake-Client-v3'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Skylake-Client-v4'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Skylake-Server'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Skylake-Server-IBRS'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Skylake-Server-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Skylake-Server-v2'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Skylake-Server-v3'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Skylake-Server-v4'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Skylake-Server-v5'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Snowridge'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='cldemote'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='core-capability'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='movdir64b'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='movdiri'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='mpx'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='split-lock-detect'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Snowridge-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='cldemote'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='core-capability'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='movdir64b'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='movdiri'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='mpx'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='split-lock-detect'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Snowridge-v2'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='cldemote'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='core-capability'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='movdir64b'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='movdiri'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='split-lock-detect'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Snowridge-v3'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='cldemote'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='core-capability'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='movdir64b'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='movdiri'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='split-lock-detect'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Snowridge-v4'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='cldemote'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='movdir64b'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='movdiri'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='athlon'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='3dnow'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='3dnowext'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='athlon-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='3dnow'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='3dnowext'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='core2duo'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ss'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='core2duo-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ss'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='coreduo'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ss'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='coreduo-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ss'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='n270'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ss'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='n270-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ss'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='phenom'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='3dnow'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='3dnowext'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='phenom-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='3dnow'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='3dnowext'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </mode>
Nov 28 17:22:13 compute-0 nova_compute[186262]:   </cpu>
Nov 28 17:22:13 compute-0 nova_compute[186262]:   <memoryBacking supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <enum name='sourceType'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <value>file</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <value>anonymous</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <value>memfd</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:   </memoryBacking>
Nov 28 17:22:13 compute-0 nova_compute[186262]:   <devices>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <disk supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='diskDevice'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>disk</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>cdrom</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>floppy</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>lun</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='bus'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>ide</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>fdc</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>scsi</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>virtio</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>usb</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>sata</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='model'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>virtio</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>virtio-transitional</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>virtio-non-transitional</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </disk>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <graphics supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='type'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>vnc</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>egl-headless</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>dbus</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </graphics>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <video supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='modelType'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>vga</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>cirrus</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>virtio</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>none</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>bochs</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>ramfb</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </video>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <hostdev supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='mode'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>subsystem</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='startupPolicy'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>default</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>mandatory</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>requisite</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>optional</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='subsysType'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>usb</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>pci</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>scsi</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='capsType'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='pciBackend'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </hostdev>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <rng supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='model'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>virtio</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>virtio-transitional</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>virtio-non-transitional</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='backendModel'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>random</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>egd</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>builtin</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </rng>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <filesystem supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='driverType'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>path</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>handle</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>virtiofs</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </filesystem>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <tpm supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='model'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>tpm-tis</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>tpm-crb</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='backendModel'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>emulator</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>external</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='backendVersion'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>2.0</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </tpm>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <redirdev supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='bus'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>usb</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </redirdev>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <channel supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='type'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>pty</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>unix</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </channel>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <crypto supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='model'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='type'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>qemu</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='backendModel'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>builtin</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </crypto>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <interface supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='backendType'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>default</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>passt</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </interface>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <panic supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='model'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>isa</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>hyperv</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </panic>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <console supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='type'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>null</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>vc</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>pty</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>dev</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>file</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>pipe</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>stdio</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>udp</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>tcp</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>unix</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>qemu-vdagent</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>dbus</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </console>
Nov 28 17:22:13 compute-0 nova_compute[186262]:   </devices>
Nov 28 17:22:13 compute-0 nova_compute[186262]:   <features>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <gic supported='no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <vmcoreinfo supported='yes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <genid supported='yes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <backingStoreInput supported='yes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <backup supported='yes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <async-teardown supported='yes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <ps2 supported='yes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <sev supported='no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <sgx supported='no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <hyperv supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='features'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>relaxed</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>vapic</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>spinlocks</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>vpindex</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>runtime</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>synic</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>stimer</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>reset</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>vendor_id</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>frequencies</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>reenlightenment</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>tlbflush</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>ipi</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>avic</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>emsr_bitmap</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>xmm_input</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <defaults>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <spinlocks>4095</spinlocks>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <stimer_direct>on</stimer_direct>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <tlbflush_direct>on</tlbflush_direct>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <tlbflush_extended>on</tlbflush_extended>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </defaults>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </hyperv>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <launchSecurity supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='sectype'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>tdx</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </launchSecurity>
Nov 28 17:22:13 compute-0 nova_compute[186262]:   </features>
Nov 28 17:22:13 compute-0 nova_compute[186262]: </domainCapabilities>
Nov 28 17:22:13 compute-0 nova_compute[186262]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 28 17:22:13 compute-0 nova_compute[186262]: 2025-11-28 17:22:13.799 186266 DEBUG nova.virt.libvirt.host [None req-23e1a686-b401-4ea3-9e9c-34be63e91e12 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Nov 28 17:22:13 compute-0 nova_compute[186262]: <domainCapabilities>
Nov 28 17:22:13 compute-0 nova_compute[186262]:   <path>/usr/libexec/qemu-kvm</path>
Nov 28 17:22:13 compute-0 nova_compute[186262]:   <domain>kvm</domain>
Nov 28 17:22:13 compute-0 nova_compute[186262]:   <machine>pc-q35-rhel9.8.0</machine>
Nov 28 17:22:13 compute-0 nova_compute[186262]:   <arch>x86_64</arch>
Nov 28 17:22:13 compute-0 nova_compute[186262]:   <vcpu max='4096'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:   <iothreads supported='yes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:   <os supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <enum name='firmware'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <value>efi</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <loader supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='type'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>rom</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>pflash</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='readonly'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>yes</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>no</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='secure'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>yes</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>no</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </loader>
Nov 28 17:22:13 compute-0 nova_compute[186262]:   </os>
Nov 28 17:22:13 compute-0 nova_compute[186262]:   <cpu>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <mode name='host-passthrough' supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='hostPassthroughMigratable'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>on</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>off</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </mode>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <mode name='maximum' supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='maximumMigratable'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>on</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>off</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </mode>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <mode name='host-model' supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <vendor>AMD</vendor>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='x2apic'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='tsc-deadline'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='hypervisor'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='tsc_adjust'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='spec-ctrl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='stibp'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='ssbd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='cmp_legacy'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='overflow-recov'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='succor'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='ibrs'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='amd-ssbd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='virt-ssbd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='lbrv'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='tsc-scale'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='vmcb-clean'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='flushbyasid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='pause-filter'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='pfthreshold'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='svme-addr-chk'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <feature policy='disable' name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </mode>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <mode name='custom' supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Broadwell'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Broadwell-IBRS'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Broadwell-noTSX'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Broadwell-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Broadwell-v2'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Broadwell-v3'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Broadwell-v4'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Cascadelake-Server'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Cascadelake-Server-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Cascadelake-Server-v2'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Cascadelake-Server-v3'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Cascadelake-Server-v4'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Cascadelake-Server-v5'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Cooperlake'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='taa-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Cooperlake-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='taa-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Cooperlake-v2'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='taa-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Denverton'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='mpx'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Denverton-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='mpx'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Denverton-v2'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Denverton-v3'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Dhyana-v2'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='EPYC-Genoa'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amd-psfd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='auto-ibrs'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512ifma'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='no-nested-data-bp'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='null-sel-clr-base'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='stibp-always-on'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='EPYC-Genoa-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amd-psfd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='auto-ibrs'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 sudo[187135]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfmqospwpvimsdnovdrhvovjvryhwnet ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350533.610331-2942-208769726329703/AnsiballZ_systemd.py'
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512ifma'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='no-nested-data-bp'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='null-sel-clr-base'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='stibp-always-on'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='EPYC-Milan'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='EPYC-Milan-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='EPYC-Milan-v2'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amd-psfd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='no-nested-data-bp'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='null-sel-clr-base'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='stibp-always-on'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='EPYC-Rome'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='EPYC-Rome-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='EPYC-Rome-v2'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 sudo[187135]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='EPYC-Rome-v3'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='EPYC-v3'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='EPYC-v4'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='GraniteRapids'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-fp16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-int8'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-tile'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx-vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-fp16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512ifma'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fbsdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrc'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrs'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fzrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='mcdt-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pbrsb-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='prefetchiti'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='psdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='sbdr-ssdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='serialize'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='taa-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='tsx-ldtrk'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xfd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='GraniteRapids-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-fp16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-int8'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-tile'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx-vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-fp16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512ifma'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fbsdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrc'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrs'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fzrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='mcdt-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pbrsb-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='prefetchiti'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='psdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='sbdr-ssdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='serialize'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='taa-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='tsx-ldtrk'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xfd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='GraniteRapids-v2'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-fp16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-int8'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-tile'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx-vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx10'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx10-128'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx10-256'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx10-512'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-fp16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512ifma'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='cldemote'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fbsdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrc'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrs'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fzrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='mcdt-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='movdir64b'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='movdiri'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pbrsb-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='prefetchiti'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='psdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='sbdr-ssdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='serialize'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ss'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='taa-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='tsx-ldtrk'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xfd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Haswell'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Haswell-IBRS'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Haswell-noTSX'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Haswell-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Haswell-v2'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Haswell-v3'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Haswell-v4'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Icelake-Server'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Icelake-Server-noTSX'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Icelake-Server-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Icelake-Server-v2'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Icelake-Server-v3'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='taa-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Icelake-Server-v4'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512ifma'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='taa-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Icelake-Server-v5'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512ifma'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='taa-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Icelake-Server-v6'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512ifma'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='taa-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Icelake-Server-v7'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512ifma'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='taa-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='IvyBridge'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='IvyBridge-IBRS'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='IvyBridge-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='IvyBridge-v2'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='KnightsMill'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-4fmaps'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-4vnniw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512er'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512pf'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ss'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='KnightsMill-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-4fmaps'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-4vnniw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512er'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512pf'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ss'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Opteron_G4'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fma4'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xop'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Opteron_G4-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fma4'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xop'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Opteron_G5'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fma4'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='tbm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xop'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Opteron_G5-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fma4'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='tbm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xop'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='SapphireRapids'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-int8'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-tile'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx-vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-fp16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512ifma'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrc'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrs'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fzrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='serialize'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='taa-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='tsx-ldtrk'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xfd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='SapphireRapids-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-int8'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-tile'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx-vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-fp16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512ifma'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrc'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrs'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fzrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='serialize'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='taa-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='tsx-ldtrk'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xfd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='SapphireRapids-v2'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-int8'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-tile'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx-vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-fp16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512ifma'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fbsdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrc'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrs'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fzrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='psdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='sbdr-ssdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='serialize'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='taa-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='tsx-ldtrk'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xfd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='SapphireRapids-v3'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-int8'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='amx-tile'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx-vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-bf16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-fp16'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bitalg'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512ifma'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='cldemote'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fbsdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrc'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrs'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fzrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='la57'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='movdir64b'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='movdiri'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='psdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='sbdr-ssdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='serialize'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ss'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='taa-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='tsx-ldtrk'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xfd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='SierraForest'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx-ifma'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx-ne-convert'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx-vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx-vnni-int8'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='cmpccxadd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fbsdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrs'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='mcdt-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pbrsb-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='psdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='sbdr-ssdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='serialize'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='SierraForest-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx-ifma'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx-ne-convert'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx-vnni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx-vnni-int8'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='cmpccxadd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fbsdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='fsrs'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ibrs-all'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='mcdt-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pbrsb-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='psdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='sbdr-ssdp-no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='serialize'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vaes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Skylake-Client'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Skylake-Client-IBRS'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Skylake-Client-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Skylake-Client-v2'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Skylake-Client-v3'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Skylake-Client-v4'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Skylake-Server'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Skylake-Server-IBRS'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Skylake-Server-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Skylake-Server-v2'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='hle'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='rtm'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Skylake-Server-v3'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Skylake-Server-v4'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Skylake-Server-v5'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512bw'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512cd'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512dq'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512f'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='avx512vl'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='invpcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pcid'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='pku'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Snowridge'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='cldemote'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='core-capability'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='movdir64b'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='movdiri'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='mpx'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='split-lock-detect'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Snowridge-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='cldemote'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='core-capability'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='movdir64b'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='movdiri'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='mpx'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='split-lock-detect'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Snowridge-v2'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='cldemote'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='core-capability'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='movdir64b'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='movdiri'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='split-lock-detect'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Snowridge-v3'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='cldemote'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='core-capability'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='movdir64b'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='movdiri'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='split-lock-detect'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='Snowridge-v4'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='cldemote'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='erms'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='gfni'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='movdir64b'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='movdiri'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='xsaves'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='athlon'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='3dnow'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='3dnowext'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='athlon-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='3dnow'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='3dnowext'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='core2duo'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ss'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='core2duo-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ss'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='coreduo'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ss'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='coreduo-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ss'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='n270'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ss'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='n270-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='ss'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='phenom'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='3dnow'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='3dnowext'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <blockers model='phenom-v1'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='3dnow'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <feature name='3dnowext'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </blockers>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </mode>
Nov 28 17:22:13 compute-0 nova_compute[186262]:   </cpu>
Nov 28 17:22:13 compute-0 nova_compute[186262]:   <memoryBacking supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <enum name='sourceType'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <value>file</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <value>anonymous</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <value>memfd</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:   </memoryBacking>
Nov 28 17:22:13 compute-0 nova_compute[186262]:   <devices>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <disk supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='diskDevice'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>disk</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>cdrom</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>floppy</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>lun</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='bus'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>fdc</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>scsi</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>virtio</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>usb</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>sata</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='model'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>virtio</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>virtio-transitional</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>virtio-non-transitional</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </disk>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <graphics supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='type'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>vnc</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>egl-headless</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>dbus</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </graphics>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <video supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='modelType'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>vga</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>cirrus</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>virtio</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>none</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>bochs</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>ramfb</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </video>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <hostdev supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='mode'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>subsystem</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='startupPolicy'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>default</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>mandatory</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>requisite</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>optional</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='subsysType'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>usb</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>pci</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>scsi</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='capsType'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='pciBackend'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </hostdev>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <rng supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='model'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>virtio</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>virtio-transitional</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>virtio-non-transitional</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='backendModel'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>random</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>egd</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>builtin</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </rng>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <filesystem supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='driverType'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>path</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>handle</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>virtiofs</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </filesystem>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <tpm supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='model'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>tpm-tis</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>tpm-crb</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='backendModel'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>emulator</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>external</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='backendVersion'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>2.0</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </tpm>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <redirdev supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='bus'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>usb</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </redirdev>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <channel supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='type'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>pty</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>unix</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </channel>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <crypto supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='model'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='type'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>qemu</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='backendModel'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>builtin</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </crypto>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <interface supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='backendType'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>default</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>passt</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </interface>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <panic supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='model'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>isa</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>hyperv</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </panic>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <console supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='type'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>null</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>vc</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>pty</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>dev</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>file</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>pipe</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>stdio</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>udp</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>tcp</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>unix</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>qemu-vdagent</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>dbus</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </console>
Nov 28 17:22:13 compute-0 nova_compute[186262]:   </devices>
Nov 28 17:22:13 compute-0 nova_compute[186262]:   <features>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <gic supported='no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <vmcoreinfo supported='yes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <genid supported='yes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <backingStoreInput supported='yes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <backup supported='yes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <async-teardown supported='yes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <ps2 supported='yes'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <sev supported='no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <sgx supported='no'/>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <hyperv supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='features'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>relaxed</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>vapic</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>spinlocks</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>vpindex</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>runtime</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>synic</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>stimer</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>reset</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>vendor_id</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>frequencies</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>reenlightenment</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>tlbflush</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>ipi</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>avic</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>emsr_bitmap</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>xmm_input</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <defaults>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <spinlocks>4095</spinlocks>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <stimer_direct>on</stimer_direct>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <tlbflush_direct>on</tlbflush_direct>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <tlbflush_extended>on</tlbflush_extended>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </defaults>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </hyperv>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     <launchSecurity supported='yes'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       <enum name='sectype'>
Nov 28 17:22:13 compute-0 nova_compute[186262]:         <value>tdx</value>
Nov 28 17:22:13 compute-0 nova_compute[186262]:       </enum>
Nov 28 17:22:13 compute-0 nova_compute[186262]:     </launchSecurity>
Nov 28 17:22:13 compute-0 nova_compute[186262]:   </features>
Nov 28 17:22:13 compute-0 nova_compute[186262]: </domainCapabilities>
Nov 28 17:22:13 compute-0 nova_compute[186262]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 28 17:22:13 compute-0 nova_compute[186262]: 2025-11-28 17:22:13.866 186266 DEBUG nova.virt.libvirt.host [None req-23e1a686-b401-4ea3-9e9c-34be63e91e12 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Nov 28 17:22:13 compute-0 nova_compute[186262]: 2025-11-28 17:22:13.866 186266 DEBUG nova.virt.libvirt.host [None req-23e1a686-b401-4ea3-9e9c-34be63e91e12 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Nov 28 17:22:13 compute-0 nova_compute[186262]: 2025-11-28 17:22:13.867 186266 DEBUG nova.virt.libvirt.host [None req-23e1a686-b401-4ea3-9e9c-34be63e91e12 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Nov 28 17:22:13 compute-0 nova_compute[186262]: 2025-11-28 17:22:13.867 186266 INFO nova.virt.libvirt.host [None req-23e1a686-b401-4ea3-9e9c-34be63e91e12 - - - - - -] Secure Boot support detected
Nov 28 17:22:13 compute-0 nova_compute[186262]: 2025-11-28 17:22:13.869 186266 INFO nova.virt.libvirt.driver [None req-23e1a686-b401-4ea3-9e9c-34be63e91e12 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Nov 28 17:22:13 compute-0 nova_compute[186262]: 2025-11-28 17:22:13.869 186266 INFO nova.virt.libvirt.driver [None req-23e1a686-b401-4ea3-9e9c-34be63e91e12 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Nov 28 17:22:13 compute-0 nova_compute[186262]: 2025-11-28 17:22:13.879 186266 DEBUG nova.virt.libvirt.driver [None req-23e1a686-b401-4ea3-9e9c-34be63e91e12 - - - - - -] cpu compare xml: <cpu match="exact">
Nov 28 17:22:13 compute-0 nova_compute[186262]:   <model>Nehalem</model>
Nov 28 17:22:13 compute-0 nova_compute[186262]: </cpu>
Nov 28 17:22:13 compute-0 nova_compute[186262]:  _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019
Nov 28 17:22:13 compute-0 nova_compute[186262]: 2025-11-28 17:22:13.882 186266 DEBUG nova.virt.libvirt.driver [None req-23e1a686-b401-4ea3-9e9c-34be63e91e12 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Nov 28 17:22:13 compute-0 nova_compute[186262]: 2025-11-28 17:22:13.903 186266 INFO nova.virt.node [None req-23e1a686-b401-4ea3-9e9c-34be63e91e12 - - - - - -] Determined node identity 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 from /var/lib/nova/compute_id
Nov 28 17:22:13 compute-0 nova_compute[186262]: 2025-11-28 17:22:13.921 186266 WARNING nova.compute.manager [None req-23e1a686-b401-4ea3-9e9c-34be63e91e12 - - - - - -] Compute nodes ['2d742abf-eadd-46e1-bc0a-5ea4c6acfad5'] for host compute-0.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Nov 28 17:22:13 compute-0 nova_compute[186262]: 2025-11-28 17:22:13.952 186266 INFO nova.compute.manager [None req-23e1a686-b401-4ea3-9e9c-34be63e91e12 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Nov 28 17:22:14 compute-0 nova_compute[186262]: 2025-11-28 17:22:14.076 186266 WARNING nova.compute.manager [None req-23e1a686-b401-4ea3-9e9c-34be63e91e12 - - - - - -] No compute node record found for host compute-0.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Nov 28 17:22:14 compute-0 nova_compute[186262]: 2025-11-28 17:22:14.077 186266 DEBUG oslo_concurrency.lockutils [None req-23e1a686-b401-4ea3-9e9c-34be63e91e12 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:22:14 compute-0 nova_compute[186262]: 2025-11-28 17:22:14.077 186266 DEBUG oslo_concurrency.lockutils [None req-23e1a686-b401-4ea3-9e9c-34be63e91e12 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:22:14 compute-0 nova_compute[186262]: 2025-11-28 17:22:14.077 186266 DEBUG oslo_concurrency.lockutils [None req-23e1a686-b401-4ea3-9e9c-34be63e91e12 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:22:14 compute-0 nova_compute[186262]: 2025-11-28 17:22:14.077 186266 DEBUG nova.compute.resource_tracker [None req-23e1a686-b401-4ea3-9e9c-34be63e91e12 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 17:22:14 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Nov 28 17:22:14 compute-0 systemd[1]: Started libvirt nodedev daemon.
Nov 28 17:22:14 compute-0 python3.9[187137]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 17:22:14 compute-0 systemd[1]: Stopping nova_compute container...
Nov 28 17:22:14 compute-0 nova_compute[186262]: 2025-11-28 17:22:14.362 186266 DEBUG oslo_concurrency.lockutils [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 17:22:14 compute-0 nova_compute[186262]: 2025-11-28 17:22:14.363 186266 DEBUG oslo_concurrency.lockutils [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 17:22:14 compute-0 nova_compute[186262]: 2025-11-28 17:22:14.363 186266 DEBUG oslo_concurrency.lockutils [None req-90a38652-6d4b-4a80-b136-895ff0dc7f07 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 17:22:14 compute-0 virtqemud[186845]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, )
Nov 28 17:22:14 compute-0 virtqemud[186845]: hostname: compute-0
Nov 28 17:22:14 compute-0 virtqemud[186845]: End of file while reading data: Input/output error
Nov 28 17:22:14 compute-0 systemd[1]: libpod-62f503b283c9d73ebceb73c7835b48325ee3e928b86c844114797933982290b0.scope: Deactivated successfully.
Nov 28 17:22:14 compute-0 systemd[1]: libpod-62f503b283c9d73ebceb73c7835b48325ee3e928b86c844114797933982290b0.scope: Consumed 3.535s CPU time.
Nov 28 17:22:14 compute-0 podman[187164]: 2025-11-28 17:22:14.855315888 +0000 UTC m=+0.545614535 container died 62f503b283c9d73ebceb73c7835b48325ee3e928b86c844114797933982290b0 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 28 17:22:14 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-62f503b283c9d73ebceb73c7835b48325ee3e928b86c844114797933982290b0-userdata-shm.mount: Deactivated successfully.
Nov 28 17:22:14 compute-0 systemd[1]: var-lib-containers-storage-overlay-45490cca88ccb6ff8bc5d9293727af1f3843372d27196562a0d6aa064ffbd607-merged.mount: Deactivated successfully.
Nov 28 17:22:14 compute-0 podman[187164]: 2025-11-28 17:22:14.938182041 +0000 UTC m=+0.628480648 container cleanup 62f503b283c9d73ebceb73c7835b48325ee3e928b86c844114797933982290b0 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=nova_compute)
Nov 28 17:22:14 compute-0 podman[187164]: nova_compute
Nov 28 17:22:15 compute-0 podman[187195]: nova_compute
Nov 28 17:22:15 compute-0 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Nov 28 17:22:15 compute-0 systemd[1]: Stopped nova_compute container.
Nov 28 17:22:15 compute-0 systemd[1]: Starting nova_compute container...
Nov 28 17:22:15 compute-0 systemd[1]: Started libcrun container.
Nov 28 17:22:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/45490cca88ccb6ff8bc5d9293727af1f3843372d27196562a0d6aa064ffbd607/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Nov 28 17:22:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/45490cca88ccb6ff8bc5d9293727af1f3843372d27196562a0d6aa064ffbd607/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 28 17:22:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/45490cca88ccb6ff8bc5d9293727af1f3843372d27196562a0d6aa064ffbd607/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 17:22:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/45490cca88ccb6ff8bc5d9293727af1f3843372d27196562a0d6aa064ffbd607/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 28 17:22:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/45490cca88ccb6ff8bc5d9293727af1f3843372d27196562a0d6aa064ffbd607/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 28 17:22:15 compute-0 podman[187208]: 2025-11-28 17:22:15.174985531 +0000 UTC m=+0.127443411 container init 62f503b283c9d73ebceb73c7835b48325ee3e928b86c844114797933982290b0 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, container_name=nova_compute, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 28 17:22:15 compute-0 podman[187208]: 2025-11-28 17:22:15.181931713 +0000 UTC m=+0.134389563 container start 62f503b283c9d73ebceb73c7835b48325ee3e928b86c844114797933982290b0 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 28 17:22:15 compute-0 nova_compute[187223]: + sudo -E kolla_set_configs
Nov 28 17:22:15 compute-0 podman[187208]: nova_compute
Nov 28 17:22:15 compute-0 systemd[1]: Started nova_compute container.
Nov 28 17:22:15 compute-0 sudo[187135]: pam_unix(sudo:session): session closed for user root
Nov 28 17:22:15 compute-0 nova_compute[187223]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 28 17:22:15 compute-0 nova_compute[187223]: INFO:__main__:Validating config file
Nov 28 17:22:15 compute-0 nova_compute[187223]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 28 17:22:15 compute-0 nova_compute[187223]: INFO:__main__:Copying service configuration files
Nov 28 17:22:15 compute-0 nova_compute[187223]: INFO:__main__:Deleting /etc/nova/nova.conf
Nov 28 17:22:15 compute-0 nova_compute[187223]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Nov 28 17:22:15 compute-0 nova_compute[187223]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Nov 28 17:22:15 compute-0 nova_compute[187223]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Nov 28 17:22:15 compute-0 nova_compute[187223]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Nov 28 17:22:15 compute-0 nova_compute[187223]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Nov 28 17:22:15 compute-0 nova_compute[187223]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 28 17:22:15 compute-0 nova_compute[187223]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 28 17:22:15 compute-0 nova_compute[187223]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 28 17:22:15 compute-0 nova_compute[187223]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Nov 28 17:22:15 compute-0 nova_compute[187223]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Nov 28 17:22:15 compute-0 nova_compute[187223]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Nov 28 17:22:15 compute-0 nova_compute[187223]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 28 17:22:15 compute-0 nova_compute[187223]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 28 17:22:15 compute-0 nova_compute[187223]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 28 17:22:15 compute-0 nova_compute[187223]: INFO:__main__:Deleting /etc/ceph
Nov 28 17:22:15 compute-0 nova_compute[187223]: INFO:__main__:Creating directory /etc/ceph
Nov 28 17:22:15 compute-0 nova_compute[187223]: INFO:__main__:Setting permission for /etc/ceph
Nov 28 17:22:15 compute-0 nova_compute[187223]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Nov 28 17:22:15 compute-0 nova_compute[187223]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Nov 28 17:22:15 compute-0 nova_compute[187223]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 28 17:22:15 compute-0 nova_compute[187223]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Nov 28 17:22:15 compute-0 nova_compute[187223]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Nov 28 17:22:15 compute-0 nova_compute[187223]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 28 17:22:15 compute-0 nova_compute[187223]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Nov 28 17:22:15 compute-0 nova_compute[187223]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Nov 28 17:22:15 compute-0 nova_compute[187223]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Nov 28 17:22:15 compute-0 nova_compute[187223]: INFO:__main__:Writing out command to execute
Nov 28 17:22:15 compute-0 nova_compute[187223]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Nov 28 17:22:15 compute-0 nova_compute[187223]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 28 17:22:15 compute-0 nova_compute[187223]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 28 17:22:15 compute-0 nova_compute[187223]: ++ cat /run_command
Nov 28 17:22:15 compute-0 nova_compute[187223]: + CMD=nova-compute
Nov 28 17:22:15 compute-0 nova_compute[187223]: + ARGS=
Nov 28 17:22:15 compute-0 nova_compute[187223]: + sudo kolla_copy_cacerts
Nov 28 17:22:15 compute-0 nova_compute[187223]: + [[ ! -n '' ]]
Nov 28 17:22:15 compute-0 nova_compute[187223]: + . kolla_extend_start
Nov 28 17:22:15 compute-0 nova_compute[187223]: Running command: 'nova-compute'
Nov 28 17:22:15 compute-0 nova_compute[187223]: + echo 'Running command: '\''nova-compute'\'''
Nov 28 17:22:15 compute-0 nova_compute[187223]: + umask 0022
Nov 28 17:22:15 compute-0 nova_compute[187223]: + exec nova-compute
Nov 28 17:22:15 compute-0 sudo[187384]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kgmnocupcammuqqnoitqoecnwkbgsnrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350535.4676359-2960-117767364974863/AnsiballZ_podman_container.py'
Nov 28 17:22:15 compute-0 sudo[187384]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:22:16 compute-0 python3.9[187386]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Nov 28 17:22:16 compute-0 systemd[1]: Started libpod-conmon-5c6330e4d35e750c3bcb52706d209d716479c83ee3aada6b147eaffa9ed9bc78.scope.
Nov 28 17:22:16 compute-0 systemd[1]: Started libcrun container.
Nov 28 17:22:16 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/299f19958adab056e286f77977d1b9a29fee6862d402ef5123fd12e63e38e856/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Nov 28 17:22:16 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/299f19958adab056e286f77977d1b9a29fee6862d402ef5123fd12e63e38e856/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 28 17:22:16 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/299f19958adab056e286f77977d1b9a29fee6862d402ef5123fd12e63e38e856/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Nov 28 17:22:16 compute-0 podman[187411]: 2025-11-28 17:22:16.293027345 +0000 UTC m=+0.155989521 container init 5c6330e4d35e750c3bcb52706d209d716479c83ee3aada6b147eaffa9ed9bc78 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, container_name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 17:22:16 compute-0 podman[187411]: 2025-11-28 17:22:16.301291486 +0000 UTC m=+0.164253642 container start 5c6330e4d35e750c3bcb52706d209d716479c83ee3aada6b147eaffa9ed9bc78 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 17:22:16 compute-0 python3.9[187386]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Nov 28 17:22:16 compute-0 nova_compute_init[187433]: INFO:nova_statedir:Applying nova statedir ownership
Nov 28 17:22:16 compute-0 nova_compute_init[187433]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Nov 28 17:22:16 compute-0 nova_compute_init[187433]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Nov 28 17:22:16 compute-0 nova_compute_init[187433]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Nov 28 17:22:16 compute-0 nova_compute_init[187433]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Nov 28 17:22:16 compute-0 nova_compute_init[187433]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Nov 28 17:22:16 compute-0 nova_compute_init[187433]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Nov 28 17:22:16 compute-0 nova_compute_init[187433]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Nov 28 17:22:16 compute-0 nova_compute_init[187433]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Nov 28 17:22:16 compute-0 nova_compute_init[187433]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Nov 28 17:22:16 compute-0 nova_compute_init[187433]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Nov 28 17:22:16 compute-0 nova_compute_init[187433]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Nov 28 17:22:16 compute-0 nova_compute_init[187433]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Nov 28 17:22:16 compute-0 nova_compute_init[187433]: INFO:nova_statedir:Nova statedir ownership complete
Nov 28 17:22:16 compute-0 systemd[1]: libpod-5c6330e4d35e750c3bcb52706d209d716479c83ee3aada6b147eaffa9ed9bc78.scope: Deactivated successfully.
Nov 28 17:22:16 compute-0 podman[187447]: 2025-11-28 17:22:16.40617444 +0000 UTC m=+0.028512759 container died 5c6330e4d35e750c3bcb52706d209d716479c83ee3aada6b147eaffa9ed9bc78 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=nova_compute_init, managed_by=edpm_ansible)
Nov 28 17:22:16 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5c6330e4d35e750c3bcb52706d209d716479c83ee3aada6b147eaffa9ed9bc78-userdata-shm.mount: Deactivated successfully.
Nov 28 17:22:16 compute-0 systemd[1]: var-lib-containers-storage-overlay-299f19958adab056e286f77977d1b9a29fee6862d402ef5123fd12e63e38e856-merged.mount: Deactivated successfully.
Nov 28 17:22:16 compute-0 podman[187447]: 2025-11-28 17:22:16.436643308 +0000 UTC m=+0.058981607 container cleanup 5c6330e4d35e750c3bcb52706d209d716479c83ee3aada6b147eaffa9ed9bc78 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 28 17:22:16 compute-0 systemd[1]: libpod-conmon-5c6330e4d35e750c3bcb52706d209d716479c83ee3aada6b147eaffa9ed9bc78.scope: Deactivated successfully.
Nov 28 17:22:16 compute-0 sudo[187384]: pam_unix(sudo:session): session closed for user root
Nov 28 17:22:17 compute-0 sshd-session[159085]: Connection closed by 192.168.122.30 port 42910
Nov 28 17:22:17 compute-0 sshd-session[159082]: pam_unix(sshd:session): session closed for user zuul
Nov 28 17:22:17 compute-0 systemd[1]: session-24.scope: Deactivated successfully.
Nov 28 17:22:17 compute-0 systemd[1]: session-24.scope: Consumed 2min 522ms CPU time.
Nov 28 17:22:17 compute-0 systemd-logind[788]: Session 24 logged out. Waiting for processes to exit.
Nov 28 17:22:17 compute-0 systemd-logind[788]: Removed session 24.
Nov 28 17:22:17 compute-0 nova_compute[187223]: 2025-11-28 17:22:17.493 187227 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 28 17:22:17 compute-0 nova_compute[187223]: 2025-11-28 17:22:17.494 187227 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 28 17:22:17 compute-0 nova_compute[187223]: 2025-11-28 17:22:17.494 187227 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 28 17:22:17 compute-0 nova_compute[187223]: 2025-11-28 17:22:17.494 187227 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Nov 28 17:22:17 compute-0 nova_compute[187223]: 2025-11-28 17:22:17.657 187227 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:22:17 compute-0 nova_compute[187223]: 2025-11-28 17:22:17.672 187227 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.015s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:22:17 compute-0 nova_compute[187223]: 2025-11-28 17:22:17.673 187227 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.084 187227 INFO nova.virt.driver [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.203 187227 INFO nova.compute.provider_config [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.219 187227 DEBUG oslo_concurrency.lockutils [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.219 187227 DEBUG oslo_concurrency.lockutils [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.220 187227 DEBUG oslo_concurrency.lockutils [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.220 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.220 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.220 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.220 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.220 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.221 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.221 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.221 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.221 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.221 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.221 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.221 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.222 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.222 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.222 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.222 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.222 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.222 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.222 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.223 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.223 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.223 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.223 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.223 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.223 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.224 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.224 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.224 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.224 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.224 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.225 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.225 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.225 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.225 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.225 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.225 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.226 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.226 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.226 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.226 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.226 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.227 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.227 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.227 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.227 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.227 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.227 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.228 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.228 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.228 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.228 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.228 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.228 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.228 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.229 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.229 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.229 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.229 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.229 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.229 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.229 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.230 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.230 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.230 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.230 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.230 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.230 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.231 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.231 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.231 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.231 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.231 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.231 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.232 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.232 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.232 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.232 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.232 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.232 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.232 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.233 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.233 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.233 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.233 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.233 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.233 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.234 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.234 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.234 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.234 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.234 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.234 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.234 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.235 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.235 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.235 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.235 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.235 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.235 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.236 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.236 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.236 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.236 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.236 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.237 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.237 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.237 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.237 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.237 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.237 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.237 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.238 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.238 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.238 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.238 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.238 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.238 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.239 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.239 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.239 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.239 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.239 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.240 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.240 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.240 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.240 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.240 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.240 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.241 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.241 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.241 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.241 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.241 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.242 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.242 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.242 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.242 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.242 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.243 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.243 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.243 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.243 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.244 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.244 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.244 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.244 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.245 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.245 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.245 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.245 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.246 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.246 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.246 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.246 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.246 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.247 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.247 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.247 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.247 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.248 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.248 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.248 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.248 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.248 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.249 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.249 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.249 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.249 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.250 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.250 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.250 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.250 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.250 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.251 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.251 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.251 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.251 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.251 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.252 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.252 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.252 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.252 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.253 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.253 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.253 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.253 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.254 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.254 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.254 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.254 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.254 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.255 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.255 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.255 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.255 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.256 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.256 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.256 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.256 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.256 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.257 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.257 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.257 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.257 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.257 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.258 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.258 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.258 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.258 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.258 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.259 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.259 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.259 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.259 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.260 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.260 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.260 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.260 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.261 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.261 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.261 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.261 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.261 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.262 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.262 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.262 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.262 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.263 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.263 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.263 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.263 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.263 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.264 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.264 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.264 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.264 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.265 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.265 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.265 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.265 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.265 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.266 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.266 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.266 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.266 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.267 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.267 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.267 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.267 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.268 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.268 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.268 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.268 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.268 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.269 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.269 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.269 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.269 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.269 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.270 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.270 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.270 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.270 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.271 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.271 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.271 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.271 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.272 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.272 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.272 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.272 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.272 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.273 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.273 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.273 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.273 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.274 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.274 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.274 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.274 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.274 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.275 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.275 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.275 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.275 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.276 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.276 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.276 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.276 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.276 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.277 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.277 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.277 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.277 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.278 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.278 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.278 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.278 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.278 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.279 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.279 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.279 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.279 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.280 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.280 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.280 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.280 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.280 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.281 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.281 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.281 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.281 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.281 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.282 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.282 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.282 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.282 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.282 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.283 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.283 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.283 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.283 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.284 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.284 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.284 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.284 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.284 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.285 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.285 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.285 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.285 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.286 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.286 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.286 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.286 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.286 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.287 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.287 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.287 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.287 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.288 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.288 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.288 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.288 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.288 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.289 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.289 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.289 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.289 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.290 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.290 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.290 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.290 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.290 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.291 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.291 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.291 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.292 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.292 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.292 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.292 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.293 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.293 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.293 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.293 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.293 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.294 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.294 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.294 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.294 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.295 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.295 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.295 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.295 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.295 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.296 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.296 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.296 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.296 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.296 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.297 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.297 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.297 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.297 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.298 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.298 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.298 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.298 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.298 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.299 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.299 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.299 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.299 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.300 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.300 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.300 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.300 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.301 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.301 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.301 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.301 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.301 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.302 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.302 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.302 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.302 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.302 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.303 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.303 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.303 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.303 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.304 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.304 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.304 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.304 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.304 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.305 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.305 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.305 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.305 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.306 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.306 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.306 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.306 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.306 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.307 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.307 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.307 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.307 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.308 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.308 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.308 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.308 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.308 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.309 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.309 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.309 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.309 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.310 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.310 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.310 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.310 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.310 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.311 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.311 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.311 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.311 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.312 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.312 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.312 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.312 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.312 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.313 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.313 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.313 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.313 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.314 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.314 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.314 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.314 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.315 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.315 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.315 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.315 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.315 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.316 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.316 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.316 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.316 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.317 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.317 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.317 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.317 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.317 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.318 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.318 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.318 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.318 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.318 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.319 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.319 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.319 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.319 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.319 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.320 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.320 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.320 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.320 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.320 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.321 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.321 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.321 187227 WARNING oslo_config.cfg [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Nov 28 17:22:18 compute-0 nova_compute[187223]: live_migration_uri is deprecated for removal in favor of two other options that
Nov 28 17:22:18 compute-0 nova_compute[187223]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Nov 28 17:22:18 compute-0 nova_compute[187223]: and ``live_migration_inbound_addr`` respectively.
Nov 28 17:22:18 compute-0 nova_compute[187223]: ).  Its value may be silently ignored in the future.
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.321 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.322 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.322 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.322 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.322 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.323 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.323 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.323 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.323 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.323 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.324 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.324 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.324 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.324 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.325 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.325 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.325 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.325 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.325 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.326 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.326 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.326 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.326 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.326 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.327 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.327 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.327 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.327 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.328 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.328 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.328 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.328 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.329 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.329 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.329 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.329 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.329 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.330 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.330 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.330 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.330 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.330 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.331 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.331 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.331 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.331 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.331 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.332 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.332 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.332 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.332 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.332 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.332 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.333 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.333 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.333 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.333 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.333 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.334 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.334 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.334 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.334 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.335 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.335 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.335 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.335 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.335 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.336 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.336 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.336 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.336 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.336 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.337 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.337 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.337 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.337 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.338 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.338 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.338 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.338 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.339 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.339 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.339 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.339 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.339 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.340 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.340 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.340 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.340 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.340 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.341 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.341 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.341 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.341 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.341 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.342 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.342 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.342 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.342 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.342 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.343 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.343 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.343 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.343 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.344 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.344 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.344 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.344 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.345 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.345 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.345 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.345 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.346 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.346 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.346 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.346 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.346 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.347 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.347 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.347 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.347 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.347 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.348 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.348 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.348 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.349 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.349 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.349 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.349 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.349 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.349 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.350 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.350 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.350 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.350 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.350 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.350 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.350 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.351 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.351 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.351 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.351 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.351 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.352 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.352 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.352 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.352 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.352 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.352 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.353 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.353 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.353 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.353 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.353 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.353 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.354 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.354 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.354 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.354 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.354 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.355 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.355 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.355 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.355 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.356 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.356 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.356 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.356 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.356 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.356 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.356 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.357 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.357 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.357 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.357 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.357 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.357 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.357 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.358 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.358 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.358 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.358 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.358 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.358 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.359 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.359 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.359 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.359 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.359 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.359 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.360 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.360 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.360 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.360 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.360 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.360 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.360 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.361 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.361 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.361 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.361 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.361 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.361 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.362 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.362 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.362 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.362 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.362 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.362 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.362 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.363 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.363 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.363 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.363 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.363 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.363 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.363 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.364 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.364 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.364 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.364 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.364 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.364 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.364 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.365 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.365 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.365 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.365 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.365 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.365 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.365 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.366 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.366 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.366 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.366 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.366 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.366 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.366 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.366 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.367 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.367 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.367 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.367 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.367 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.367 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.367 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.368 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.368 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.368 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.368 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.368 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.369 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.369 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.369 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.369 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.369 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.369 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.369 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.370 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.370 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.370 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.370 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.370 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.370 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.370 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.371 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.371 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.371 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.371 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.371 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.371 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.371 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.372 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.372 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.372 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.372 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.372 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.372 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.373 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.373 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.373 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.373 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.373 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.373 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.373 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.374 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.374 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.374 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.374 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.374 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.374 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.374 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.375 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.375 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.375 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.375 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.375 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.375 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.375 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.376 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.376 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.376 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.376 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.376 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.376 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.376 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.377 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.377 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.377 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.377 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.377 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.377 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.377 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.378 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.378 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.378 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.378 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.378 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.378 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.378 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.379 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.379 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.379 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.379 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.379 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.379 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.379 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.380 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.380 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.380 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.380 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.380 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.380 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.381 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.381 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.381 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.381 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.381 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.381 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.381 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.382 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.382 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.382 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.382 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.382 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.382 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.382 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.383 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.383 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.383 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.383 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.383 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.383 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.384 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.384 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.384 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.384 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.384 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.384 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.384 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.384 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.385 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.385 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.385 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.385 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.385 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.385 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.385 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.386 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.386 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.386 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.386 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.386 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.386 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.386 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.387 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.387 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.387 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.387 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.387 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.387 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.388 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.388 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.388 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.388 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.388 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.388 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.389 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.389 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.389 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.389 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.389 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.389 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.389 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.390 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.390 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.390 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.390 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.390 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.390 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.391 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.391 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.391 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.391 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.391 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.391 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.392 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.392 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.392 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.392 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.392 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.392 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.392 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.393 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.393 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.393 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.393 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.393 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.393 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.393 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.394 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.394 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.394 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.394 187227 DEBUG oslo_service.service [None req-aa1eeeff-aebf-4ec0-8824-4ad53d3d6fb9 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.395 187227 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.408 187227 INFO nova.virt.node [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Determined node identity 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 from /var/lib/nova/compute_id
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.409 187227 DEBUG nova.virt.libvirt.host [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.409 187227 DEBUG nova.virt.libvirt.host [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.410 187227 DEBUG nova.virt.libvirt.host [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.410 187227 DEBUG nova.virt.libvirt.host [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.424 187227 DEBUG nova.virt.libvirt.host [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f2b47bc9430> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.426 187227 DEBUG nova.virt.libvirt.host [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f2b47bc9430> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.426 187227 INFO nova.virt.libvirt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Connection event '1' reason 'None'
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.433 187227 INFO nova.virt.libvirt.host [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Libvirt host capabilities <capabilities>
Nov 28 17:22:18 compute-0 nova_compute[187223]: 
Nov 28 17:22:18 compute-0 nova_compute[187223]:   <host>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <uuid>ea074f67-9ae1-4549-ac73-672b8df6afe1</uuid>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <cpu>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <arch>x86_64</arch>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model>EPYC-Rome-v4</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <vendor>AMD</vendor>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <microcode version='16777317'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <signature family='23' model='49' stepping='0'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <maxphysaddr mode='emulate' bits='40'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature name='x2apic'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature name='tsc-deadline'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature name='osxsave'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature name='hypervisor'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature name='tsc_adjust'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature name='spec-ctrl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature name='stibp'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature name='arch-capabilities'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature name='ssbd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature name='cmp_legacy'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature name='topoext'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature name='virt-ssbd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature name='lbrv'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature name='tsc-scale'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature name='vmcb-clean'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature name='pause-filter'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature name='pfthreshold'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature name='svme-addr-chk'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature name='rdctl-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature name='skip-l1dfl-vmentry'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature name='mds-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature name='pschange-mc-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <pages unit='KiB' size='4'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <pages unit='KiB' size='2048'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <pages unit='KiB' size='1048576'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </cpu>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <power_management>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <suspend_mem/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <suspend_disk/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <suspend_hybrid/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </power_management>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <iommu support='no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <migration_features>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <live/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <uri_transports>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <uri_transport>tcp</uri_transport>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <uri_transport>rdma</uri_transport>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </uri_transports>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </migration_features>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <topology>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <cells num='1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <cell id='0'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:           <memory unit='KiB'>7864324</memory>
Nov 28 17:22:18 compute-0 nova_compute[187223]:           <pages unit='KiB' size='4'>1966081</pages>
Nov 28 17:22:18 compute-0 nova_compute[187223]:           <pages unit='KiB' size='2048'>0</pages>
Nov 28 17:22:18 compute-0 nova_compute[187223]:           <pages unit='KiB' size='1048576'>0</pages>
Nov 28 17:22:18 compute-0 nova_compute[187223]:           <distances>
Nov 28 17:22:18 compute-0 nova_compute[187223]:             <sibling id='0' value='10'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:           </distances>
Nov 28 17:22:18 compute-0 nova_compute[187223]:           <cpus num='8'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:           </cpus>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         </cell>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </cells>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </topology>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <cache>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </cache>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <secmodel>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model>selinux</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <doi>0</doi>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </secmodel>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <secmodel>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model>dac</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <doi>0</doi>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <baselabel type='kvm'>+107:+107</baselabel>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <baselabel type='qemu'>+107:+107</baselabel>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </secmodel>
Nov 28 17:22:18 compute-0 nova_compute[187223]:   </host>
Nov 28 17:22:18 compute-0 nova_compute[187223]: 
Nov 28 17:22:18 compute-0 nova_compute[187223]:   <guest>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <os_type>hvm</os_type>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <arch name='i686'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <wordsize>32</wordsize>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <domain type='qemu'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <domain type='kvm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </arch>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <features>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <pae/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <nonpae/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <acpi default='on' toggle='yes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <apic default='on' toggle='no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <cpuselection/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <deviceboot/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <disksnapshot default='on' toggle='no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <externalSnapshot/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </features>
Nov 28 17:22:18 compute-0 nova_compute[187223]:   </guest>
Nov 28 17:22:18 compute-0 nova_compute[187223]: 
Nov 28 17:22:18 compute-0 nova_compute[187223]:   <guest>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <os_type>hvm</os_type>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <arch name='x86_64'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <wordsize>64</wordsize>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <domain type='qemu'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <domain type='kvm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </arch>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <features>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <acpi default='on' toggle='yes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <apic default='on' toggle='no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <cpuselection/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <deviceboot/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <disksnapshot default='on' toggle='no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <externalSnapshot/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </features>
Nov 28 17:22:18 compute-0 nova_compute[187223]:   </guest>
Nov 28 17:22:18 compute-0 nova_compute[187223]: 
Nov 28 17:22:18 compute-0 nova_compute[187223]: </capabilities>
Nov 28 17:22:18 compute-0 nova_compute[187223]: 
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.439 187227 DEBUG nova.virt.libvirt.volume.mount [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.442 187227 DEBUG nova.virt.libvirt.host [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.451 187227 DEBUG nova.virt.libvirt.host [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Nov 28 17:22:18 compute-0 nova_compute[187223]: <domainCapabilities>
Nov 28 17:22:18 compute-0 nova_compute[187223]:   <path>/usr/libexec/qemu-kvm</path>
Nov 28 17:22:18 compute-0 nova_compute[187223]:   <domain>kvm</domain>
Nov 28 17:22:18 compute-0 nova_compute[187223]:   <machine>pc-q35-rhel9.8.0</machine>
Nov 28 17:22:18 compute-0 nova_compute[187223]:   <arch>i686</arch>
Nov 28 17:22:18 compute-0 nova_compute[187223]:   <vcpu max='4096'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:   <iothreads supported='yes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:   <os supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <enum name='firmware'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <loader supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='type'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>rom</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>pflash</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='readonly'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>yes</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>no</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='secure'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>no</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </loader>
Nov 28 17:22:18 compute-0 nova_compute[187223]:   </os>
Nov 28 17:22:18 compute-0 nova_compute[187223]:   <cpu>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <mode name='host-passthrough' supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='hostPassthroughMigratable'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>on</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>off</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </mode>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <mode name='maximum' supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='maximumMigratable'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>on</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>off</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </mode>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <mode name='host-model' supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <vendor>AMD</vendor>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='x2apic'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='tsc-deadline'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='hypervisor'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='tsc_adjust'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='spec-ctrl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='stibp'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='ssbd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='cmp_legacy'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='overflow-recov'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='succor'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='ibrs'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='amd-ssbd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='virt-ssbd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='lbrv'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='tsc-scale'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='vmcb-clean'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='flushbyasid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='pause-filter'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='pfthreshold'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='svme-addr-chk'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='disable' name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </mode>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <mode name='custom' supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Broadwell'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Broadwell-IBRS'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Broadwell-noTSX'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Broadwell-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Broadwell-v2'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Broadwell-v3'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Broadwell-v4'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Cascadelake-Server'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Cascadelake-Server-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Cascadelake-Server-v2'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Cascadelake-Server-v3'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Cascadelake-Server-v4'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Cascadelake-Server-v5'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Cooperlake'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='taa-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Cooperlake-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='taa-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Cooperlake-v2'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='taa-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Denverton'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='mpx'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Denverton-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='mpx'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Denverton-v2'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Denverton-v3'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Dhyana-v2'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='EPYC-Genoa'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amd-psfd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='auto-ibrs'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512ifma'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='no-nested-data-bp'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='null-sel-clr-base'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='stibp-always-on'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='EPYC-Genoa-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amd-psfd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='auto-ibrs'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512ifma'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='no-nested-data-bp'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='null-sel-clr-base'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='stibp-always-on'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='EPYC-Milan'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='EPYC-Milan-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='EPYC-Milan-v2'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amd-psfd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='no-nested-data-bp'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='null-sel-clr-base'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='stibp-always-on'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='EPYC-Rome'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='EPYC-Rome-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='EPYC-Rome-v2'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='EPYC-Rome-v3'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='EPYC-v3'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='EPYC-v4'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='GraniteRapids'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-fp16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-int8'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-tile'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx-vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-fp16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512ifma'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fbsdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrc'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrs'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fzrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='mcdt-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pbrsb-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='prefetchiti'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='psdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='sbdr-ssdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='serialize'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='taa-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='tsx-ldtrk'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xfd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='GraniteRapids-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-fp16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-int8'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-tile'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx-vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-fp16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512ifma'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fbsdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrc'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrs'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fzrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='mcdt-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pbrsb-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='prefetchiti'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='psdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='sbdr-ssdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='serialize'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='taa-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='tsx-ldtrk'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xfd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='GraniteRapids-v2'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-fp16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-int8'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-tile'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx-vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx10'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx10-128'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx10-256'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx10-512'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-fp16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512ifma'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='cldemote'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fbsdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrc'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrs'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fzrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='mcdt-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='movdir64b'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='movdiri'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pbrsb-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='prefetchiti'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='psdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='sbdr-ssdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='serialize'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ss'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='taa-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='tsx-ldtrk'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xfd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Haswell'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Haswell-IBRS'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Haswell-noTSX'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Haswell-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Haswell-v2'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Haswell-v3'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Haswell-v4'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Icelake-Server'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Icelake-Server-noTSX'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Icelake-Server-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Icelake-Server-v2'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Icelake-Server-v3'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='taa-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Icelake-Server-v4'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512ifma'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='taa-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Icelake-Server-v5'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512ifma'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='taa-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Icelake-Server-v6'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512ifma'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='taa-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Icelake-Server-v7'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512ifma'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='taa-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='IvyBridge'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='IvyBridge-IBRS'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='IvyBridge-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='IvyBridge-v2'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='KnightsMill'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-4fmaps'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-4vnniw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512er'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512pf'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ss'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='KnightsMill-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-4fmaps'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-4vnniw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512er'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512pf'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ss'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Opteron_G4'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fma4'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xop'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Opteron_G4-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fma4'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xop'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Opteron_G5'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fma4'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='tbm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xop'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Opteron_G5-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fma4'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='tbm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xop'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='SapphireRapids'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-int8'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-tile'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx-vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-fp16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512ifma'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrc'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrs'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fzrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='serialize'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='taa-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='tsx-ldtrk'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xfd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='SapphireRapids-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-int8'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-tile'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx-vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-fp16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512ifma'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrc'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrs'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fzrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='serialize'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='taa-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='tsx-ldtrk'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xfd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='SapphireRapids-v2'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-int8'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-tile'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx-vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-fp16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512ifma'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fbsdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrc'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrs'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fzrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='psdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='sbdr-ssdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='serialize'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='taa-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='tsx-ldtrk'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xfd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='SapphireRapids-v3'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-int8'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-tile'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx-vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-fp16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512ifma'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='cldemote'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fbsdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrc'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrs'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fzrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='movdir64b'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='movdiri'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='psdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='sbdr-ssdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='serialize'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ss'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='taa-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='tsx-ldtrk'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xfd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='SierraForest'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx-ifma'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx-ne-convert'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx-vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx-vnni-int8'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='cmpccxadd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fbsdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrs'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='mcdt-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pbrsb-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='psdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='sbdr-ssdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='serialize'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='SierraForest-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx-ifma'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx-ne-convert'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx-vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx-vnni-int8'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='cmpccxadd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fbsdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrs'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='mcdt-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pbrsb-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='psdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='sbdr-ssdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='serialize'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Skylake-Client'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Skylake-Client-IBRS'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Skylake-Client-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Skylake-Client-v2'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Skylake-Client-v3'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Skylake-Client-v4'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Skylake-Server'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Skylake-Server-IBRS'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Skylake-Server-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Skylake-Server-v2'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Skylake-Server-v3'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Skylake-Server-v4'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Skylake-Server-v5'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Snowridge'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='cldemote'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='core-capability'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='movdir64b'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='movdiri'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='mpx'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='split-lock-detect'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Snowridge-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='cldemote'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='core-capability'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='movdir64b'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='movdiri'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='mpx'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='split-lock-detect'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Snowridge-v2'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='cldemote'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='core-capability'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='movdir64b'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='movdiri'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='split-lock-detect'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Snowridge-v3'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='cldemote'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='core-capability'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='movdir64b'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='movdiri'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='split-lock-detect'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Snowridge-v4'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='cldemote'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='movdir64b'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='movdiri'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='athlon'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='3dnow'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='3dnowext'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='athlon-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='3dnow'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='3dnowext'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='core2duo'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ss'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='core2duo-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ss'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='coreduo'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ss'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='coreduo-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ss'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='n270'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ss'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='n270-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ss'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='phenom'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='3dnow'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='3dnowext'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='phenom-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='3dnow'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='3dnowext'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </mode>
Nov 28 17:22:18 compute-0 nova_compute[187223]:   </cpu>
Nov 28 17:22:18 compute-0 nova_compute[187223]:   <memoryBacking supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <enum name='sourceType'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <value>file</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <value>anonymous</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <value>memfd</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:   </memoryBacking>
Nov 28 17:22:18 compute-0 nova_compute[187223]:   <devices>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <disk supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='diskDevice'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>disk</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>cdrom</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>floppy</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>lun</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='bus'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>fdc</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>scsi</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>virtio</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>usb</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>sata</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='model'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>virtio</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>virtio-transitional</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>virtio-non-transitional</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </disk>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <graphics supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='type'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>vnc</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>egl-headless</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>dbus</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </graphics>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <video supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='modelType'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>vga</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>cirrus</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>virtio</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>none</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>bochs</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>ramfb</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </video>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <hostdev supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='mode'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>subsystem</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='startupPolicy'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>default</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>mandatory</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>requisite</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>optional</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='subsysType'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>usb</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>pci</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>scsi</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='capsType'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='pciBackend'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </hostdev>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <rng supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='model'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>virtio</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>virtio-transitional</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>virtio-non-transitional</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='backendModel'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>random</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>egd</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>builtin</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </rng>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <filesystem supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='driverType'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>path</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>handle</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>virtiofs</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </filesystem>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <tpm supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='model'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>tpm-tis</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>tpm-crb</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='backendModel'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>emulator</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>external</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='backendVersion'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>2.0</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </tpm>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <redirdev supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='bus'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>usb</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </redirdev>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <channel supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='type'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>pty</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>unix</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </channel>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <crypto supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='model'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='type'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>qemu</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='backendModel'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>builtin</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </crypto>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <interface supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='backendType'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>default</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>passt</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </interface>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <panic supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='model'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>isa</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>hyperv</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </panic>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <console supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='type'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>null</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>vc</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>pty</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>dev</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>file</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>pipe</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>stdio</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>udp</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>tcp</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>unix</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>qemu-vdagent</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>dbus</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </console>
Nov 28 17:22:18 compute-0 nova_compute[187223]:   </devices>
Nov 28 17:22:18 compute-0 nova_compute[187223]:   <features>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <gic supported='no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <vmcoreinfo supported='yes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <genid supported='yes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <backingStoreInput supported='yes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <backup supported='yes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <async-teardown supported='yes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <ps2 supported='yes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <sev supported='no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <sgx supported='no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <hyperv supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='features'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>relaxed</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>vapic</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>spinlocks</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>vpindex</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>runtime</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>synic</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>stimer</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>reset</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>vendor_id</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>frequencies</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>reenlightenment</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>tlbflush</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>ipi</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>avic</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>emsr_bitmap</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>xmm_input</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <defaults>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <spinlocks>4095</spinlocks>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <stimer_direct>on</stimer_direct>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <tlbflush_direct>on</tlbflush_direct>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <tlbflush_extended>on</tlbflush_extended>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </defaults>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </hyperv>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <launchSecurity supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='sectype'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>tdx</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </launchSecurity>
Nov 28 17:22:18 compute-0 nova_compute[187223]:   </features>
Nov 28 17:22:18 compute-0 nova_compute[187223]: </domainCapabilities>
Nov 28 17:22:18 compute-0 nova_compute[187223]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.456 187227 DEBUG nova.virt.libvirt.host [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Nov 28 17:22:18 compute-0 nova_compute[187223]: <domainCapabilities>
Nov 28 17:22:18 compute-0 nova_compute[187223]:   <path>/usr/libexec/qemu-kvm</path>
Nov 28 17:22:18 compute-0 nova_compute[187223]:   <domain>kvm</domain>
Nov 28 17:22:18 compute-0 nova_compute[187223]:   <machine>pc-i440fx-rhel7.6.0</machine>
Nov 28 17:22:18 compute-0 nova_compute[187223]:   <arch>i686</arch>
Nov 28 17:22:18 compute-0 nova_compute[187223]:   <vcpu max='240'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:   <iothreads supported='yes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:   <os supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <enum name='firmware'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <loader supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='type'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>rom</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>pflash</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='readonly'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>yes</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>no</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='secure'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>no</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </loader>
Nov 28 17:22:18 compute-0 nova_compute[187223]:   </os>
Nov 28 17:22:18 compute-0 nova_compute[187223]:   <cpu>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <mode name='host-passthrough' supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='hostPassthroughMigratable'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>on</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>off</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </mode>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <mode name='maximum' supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='maximumMigratable'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>on</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>off</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </mode>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <mode name='host-model' supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <vendor>AMD</vendor>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='x2apic'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='tsc-deadline'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='hypervisor'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='tsc_adjust'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='spec-ctrl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='stibp'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='ssbd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='cmp_legacy'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='overflow-recov'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='succor'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='ibrs'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='amd-ssbd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='virt-ssbd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='lbrv'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='tsc-scale'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='vmcb-clean'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='flushbyasid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='pause-filter'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='pfthreshold'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='svme-addr-chk'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='disable' name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </mode>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <mode name='custom' supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Broadwell'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Broadwell-IBRS'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Broadwell-noTSX'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Broadwell-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Broadwell-v2'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Broadwell-v3'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Broadwell-v4'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Cascadelake-Server'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Cascadelake-Server-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Cascadelake-Server-v2'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Cascadelake-Server-v3'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Cascadelake-Server-v4'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Cascadelake-Server-v5'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Cooperlake'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='taa-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Cooperlake-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='taa-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Cooperlake-v2'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='taa-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Denverton'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='mpx'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Denverton-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='mpx'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Denverton-v2'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Denverton-v3'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Dhyana-v2'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='EPYC-Genoa'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amd-psfd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='auto-ibrs'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512ifma'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='no-nested-data-bp'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='null-sel-clr-base'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='stibp-always-on'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='EPYC-Genoa-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amd-psfd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='auto-ibrs'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512ifma'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='no-nested-data-bp'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='null-sel-clr-base'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='stibp-always-on'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='EPYC-Milan'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='EPYC-Milan-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='EPYC-Milan-v2'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amd-psfd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='no-nested-data-bp'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='null-sel-clr-base'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='stibp-always-on'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='EPYC-Rome'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='EPYC-Rome-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='EPYC-Rome-v2'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='EPYC-Rome-v3'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='EPYC-v3'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='EPYC-v4'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='GraniteRapids'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-fp16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-int8'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-tile'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx-vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-fp16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512ifma'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fbsdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrc'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrs'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fzrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='mcdt-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pbrsb-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='prefetchiti'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='psdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='sbdr-ssdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='serialize'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='taa-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='tsx-ldtrk'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xfd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='GraniteRapids-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-fp16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-int8'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-tile'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx-vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-fp16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512ifma'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fbsdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrc'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrs'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fzrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='mcdt-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pbrsb-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='prefetchiti'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='psdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='sbdr-ssdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='serialize'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='taa-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='tsx-ldtrk'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xfd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='GraniteRapids-v2'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-fp16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-int8'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-tile'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx-vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx10'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx10-128'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx10-256'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx10-512'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-fp16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512ifma'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='cldemote'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fbsdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrc'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrs'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fzrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='mcdt-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='movdir64b'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='movdiri'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pbrsb-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='prefetchiti'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='psdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='sbdr-ssdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='serialize'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ss'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='taa-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='tsx-ldtrk'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xfd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Haswell'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Haswell-IBRS'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Haswell-noTSX'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Haswell-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Haswell-v2'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Haswell-v3'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Haswell-v4'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Icelake-Server'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Icelake-Server-noTSX'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Icelake-Server-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Icelake-Server-v2'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Icelake-Server-v3'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='taa-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Icelake-Server-v4'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512ifma'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='taa-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Icelake-Server-v5'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512ifma'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='taa-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Icelake-Server-v6'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512ifma'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='taa-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Icelake-Server-v7'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512ifma'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='taa-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='IvyBridge'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='IvyBridge-IBRS'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='IvyBridge-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='IvyBridge-v2'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='KnightsMill'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-4fmaps'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-4vnniw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512er'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512pf'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ss'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='KnightsMill-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-4fmaps'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-4vnniw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512er'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512pf'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ss'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Opteron_G4'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fma4'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xop'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Opteron_G4-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fma4'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xop'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Opteron_G5'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fma4'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='tbm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xop'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Opteron_G5-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fma4'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='tbm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xop'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='SapphireRapids'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-int8'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-tile'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx-vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-fp16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512ifma'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrc'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrs'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fzrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='serialize'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='taa-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='tsx-ldtrk'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xfd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='SapphireRapids-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-int8'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-tile'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx-vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-fp16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512ifma'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrc'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrs'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fzrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='serialize'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='taa-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='tsx-ldtrk'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xfd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='SapphireRapids-v2'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-int8'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-tile'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx-vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-fp16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512ifma'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fbsdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrc'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrs'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fzrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='psdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='sbdr-ssdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='serialize'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='taa-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='tsx-ldtrk'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xfd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='SapphireRapids-v3'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-int8'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-tile'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx-vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-fp16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512ifma'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='cldemote'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fbsdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrc'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrs'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fzrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='movdir64b'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='movdiri'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='psdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='sbdr-ssdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='serialize'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ss'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='taa-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='tsx-ldtrk'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xfd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='SierraForest'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx-ifma'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx-ne-convert'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx-vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx-vnni-int8'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='cmpccxadd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fbsdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrs'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='mcdt-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pbrsb-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='psdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='sbdr-ssdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='serialize'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='SierraForest-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx-ifma'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx-ne-convert'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx-vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx-vnni-int8'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='cmpccxadd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fbsdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrs'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='mcdt-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pbrsb-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='psdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='sbdr-ssdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='serialize'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Skylake-Client'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Skylake-Client-IBRS'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Skylake-Client-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Skylake-Client-v2'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Skylake-Client-v3'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Skylake-Client-v4'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Skylake-Server'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Skylake-Server-IBRS'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Skylake-Server-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Skylake-Server-v2'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Skylake-Server-v3'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Skylake-Server-v4'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Skylake-Server-v5'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Snowridge'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='cldemote'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='core-capability'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='movdir64b'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='movdiri'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='mpx'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='split-lock-detect'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Snowridge-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='cldemote'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='core-capability'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='movdir64b'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='movdiri'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='mpx'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='split-lock-detect'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Snowridge-v2'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='cldemote'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='core-capability'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='movdir64b'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='movdiri'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='split-lock-detect'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Snowridge-v3'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='cldemote'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='core-capability'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='movdir64b'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='movdiri'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='split-lock-detect'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Snowridge-v4'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='cldemote'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='movdir64b'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='movdiri'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='athlon'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='3dnow'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='3dnowext'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='athlon-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='3dnow'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='3dnowext'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='core2duo'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ss'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='core2duo-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ss'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='coreduo'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ss'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='coreduo-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ss'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='n270'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ss'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='n270-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ss'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='phenom'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='3dnow'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='3dnowext'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='phenom-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='3dnow'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='3dnowext'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </mode>
Nov 28 17:22:18 compute-0 nova_compute[187223]:   </cpu>
Nov 28 17:22:18 compute-0 nova_compute[187223]:   <memoryBacking supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <enum name='sourceType'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <value>file</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <value>anonymous</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <value>memfd</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:   </memoryBacking>
Nov 28 17:22:18 compute-0 nova_compute[187223]:   <devices>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <disk supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='diskDevice'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>disk</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>cdrom</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>floppy</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>lun</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='bus'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>ide</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>fdc</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>scsi</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>virtio</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>usb</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>sata</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='model'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>virtio</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>virtio-transitional</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>virtio-non-transitional</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </disk>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <graphics supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='type'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>vnc</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>egl-headless</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>dbus</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </graphics>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <video supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='modelType'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>vga</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>cirrus</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>virtio</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>none</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>bochs</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>ramfb</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </video>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <hostdev supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='mode'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>subsystem</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='startupPolicy'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>default</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>mandatory</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>requisite</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>optional</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='subsysType'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>usb</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>pci</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>scsi</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='capsType'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='pciBackend'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </hostdev>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <rng supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='model'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>virtio</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>virtio-transitional</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>virtio-non-transitional</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='backendModel'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>random</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>egd</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>builtin</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </rng>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <filesystem supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='driverType'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>path</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>handle</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>virtiofs</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </filesystem>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <tpm supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='model'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>tpm-tis</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>tpm-crb</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='backendModel'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>emulator</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>external</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='backendVersion'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>2.0</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </tpm>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <redirdev supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='bus'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>usb</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </redirdev>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <channel supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='type'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>pty</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>unix</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </channel>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <crypto supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='model'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='type'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>qemu</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='backendModel'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>builtin</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </crypto>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <interface supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='backendType'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>default</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>passt</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </interface>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <panic supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='model'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>isa</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>hyperv</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </panic>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <console supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='type'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>null</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>vc</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>pty</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>dev</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>file</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>pipe</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>stdio</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>udp</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>tcp</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>unix</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>qemu-vdagent</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>dbus</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </console>
Nov 28 17:22:18 compute-0 nova_compute[187223]:   </devices>
Nov 28 17:22:18 compute-0 nova_compute[187223]:   <features>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <gic supported='no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <vmcoreinfo supported='yes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <genid supported='yes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <backingStoreInput supported='yes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <backup supported='yes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <async-teardown supported='yes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <ps2 supported='yes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <sev supported='no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <sgx supported='no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <hyperv supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='features'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>relaxed</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>vapic</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>spinlocks</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>vpindex</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>runtime</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>synic</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>stimer</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>reset</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>vendor_id</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>frequencies</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>reenlightenment</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>tlbflush</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>ipi</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>avic</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>emsr_bitmap</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>xmm_input</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <defaults>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <spinlocks>4095</spinlocks>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <stimer_direct>on</stimer_direct>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <tlbflush_direct>on</tlbflush_direct>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <tlbflush_extended>on</tlbflush_extended>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </defaults>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </hyperv>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <launchSecurity supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='sectype'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>tdx</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </launchSecurity>
Nov 28 17:22:18 compute-0 nova_compute[187223]:   </features>
Nov 28 17:22:18 compute-0 nova_compute[187223]: </domainCapabilities>
Nov 28 17:22:18 compute-0 nova_compute[187223]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.483 187227 DEBUG nova.virt.libvirt.host [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.487 187227 DEBUG nova.virt.libvirt.host [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Nov 28 17:22:18 compute-0 nova_compute[187223]: <domainCapabilities>
Nov 28 17:22:18 compute-0 nova_compute[187223]:   <path>/usr/libexec/qemu-kvm</path>
Nov 28 17:22:18 compute-0 nova_compute[187223]:   <domain>kvm</domain>
Nov 28 17:22:18 compute-0 nova_compute[187223]:   <machine>pc-q35-rhel9.8.0</machine>
Nov 28 17:22:18 compute-0 nova_compute[187223]:   <arch>x86_64</arch>
Nov 28 17:22:18 compute-0 nova_compute[187223]:   <vcpu max='4096'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:   <iothreads supported='yes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:   <os supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <enum name='firmware'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <value>efi</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <loader supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='type'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>rom</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>pflash</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='readonly'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>yes</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>no</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='secure'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>yes</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>no</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </loader>
Nov 28 17:22:18 compute-0 nova_compute[187223]:   </os>
Nov 28 17:22:18 compute-0 nova_compute[187223]:   <cpu>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <mode name='host-passthrough' supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='hostPassthroughMigratable'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>on</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>off</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </mode>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <mode name='maximum' supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='maximumMigratable'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>on</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>off</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </mode>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <mode name='host-model' supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <vendor>AMD</vendor>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='x2apic'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='tsc-deadline'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='hypervisor'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='tsc_adjust'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='spec-ctrl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='stibp'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='ssbd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='cmp_legacy'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='overflow-recov'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='succor'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='ibrs'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='amd-ssbd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='virt-ssbd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='lbrv'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='tsc-scale'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='vmcb-clean'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='flushbyasid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='pause-filter'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='pfthreshold'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='svme-addr-chk'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='disable' name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </mode>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <mode name='custom' supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Broadwell'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Broadwell-IBRS'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Broadwell-noTSX'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Broadwell-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Broadwell-v2'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Broadwell-v3'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Broadwell-v4'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Cascadelake-Server'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Cascadelake-Server-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Cascadelake-Server-v2'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Cascadelake-Server-v3'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Cascadelake-Server-v4'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Cascadelake-Server-v5'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Cooperlake'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='taa-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Cooperlake-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='taa-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Cooperlake-v2'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='taa-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Denverton'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='mpx'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Denverton-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='mpx'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Denverton-v2'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Denverton-v3'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Dhyana-v2'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='EPYC-Genoa'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amd-psfd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='auto-ibrs'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512ifma'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='no-nested-data-bp'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='null-sel-clr-base'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='stibp-always-on'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='EPYC-Genoa-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amd-psfd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='auto-ibrs'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512ifma'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='no-nested-data-bp'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='null-sel-clr-base'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='stibp-always-on'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='EPYC-Milan'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='EPYC-Milan-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='EPYC-Milan-v2'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amd-psfd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='no-nested-data-bp'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='null-sel-clr-base'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='stibp-always-on'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='EPYC-Rome'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='EPYC-Rome-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='EPYC-Rome-v2'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='EPYC-Rome-v3'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='EPYC-v3'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='EPYC-v4'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='GraniteRapids'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-fp16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-int8'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-tile'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx-vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-fp16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512ifma'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fbsdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrc'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrs'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fzrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='mcdt-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pbrsb-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='prefetchiti'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='psdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='sbdr-ssdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='serialize'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='taa-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='tsx-ldtrk'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xfd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='GraniteRapids-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-fp16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-int8'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-tile'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx-vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-fp16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512ifma'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fbsdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrc'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrs'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fzrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='mcdt-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pbrsb-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='prefetchiti'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='psdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='sbdr-ssdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='serialize'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='taa-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='tsx-ldtrk'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xfd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='GraniteRapids-v2'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-fp16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-int8'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-tile'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx-vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx10'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx10-128'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx10-256'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx10-512'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-fp16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512ifma'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='cldemote'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fbsdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrc'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrs'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fzrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='mcdt-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='movdir64b'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='movdiri'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pbrsb-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='prefetchiti'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='psdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='sbdr-ssdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='serialize'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ss'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='taa-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='tsx-ldtrk'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xfd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Haswell'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Haswell-IBRS'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Haswell-noTSX'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Haswell-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Haswell-v2'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Haswell-v3'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Haswell-v4'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Icelake-Server'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Icelake-Server-noTSX'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Icelake-Server-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Icelake-Server-v2'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Icelake-Server-v3'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='taa-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Icelake-Server-v4'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512ifma'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='taa-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Icelake-Server-v5'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512ifma'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='taa-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Icelake-Server-v6'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512ifma'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='taa-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Icelake-Server-v7'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512ifma'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='taa-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='IvyBridge'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='IvyBridge-IBRS'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='IvyBridge-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='IvyBridge-v2'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='KnightsMill'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-4fmaps'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-4vnniw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512er'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512pf'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ss'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='KnightsMill-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-4fmaps'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-4vnniw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512er'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512pf'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ss'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Opteron_G4'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fma4'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xop'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Opteron_G4-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fma4'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xop'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Opteron_G5'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fma4'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='tbm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xop'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Opteron_G5-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fma4'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='tbm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xop'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='SapphireRapids'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-int8'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-tile'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx-vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-fp16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512ifma'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrc'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrs'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fzrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='serialize'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='taa-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='tsx-ldtrk'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xfd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='SapphireRapids-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-int8'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-tile'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx-vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-fp16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512ifma'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrc'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrs'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fzrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='serialize'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='taa-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='tsx-ldtrk'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xfd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='SapphireRapids-v2'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-int8'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-tile'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx-vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-fp16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512ifma'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fbsdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrc'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrs'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fzrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='psdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='sbdr-ssdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='serialize'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='taa-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='tsx-ldtrk'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xfd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='SapphireRapids-v3'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-int8'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-tile'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx-vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-fp16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512ifma'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='cldemote'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fbsdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrc'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrs'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fzrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='movdir64b'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='movdiri'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='psdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='sbdr-ssdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='serialize'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ss'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='taa-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='tsx-ldtrk'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xfd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='SierraForest'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx-ifma'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx-ne-convert'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx-vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx-vnni-int8'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='cmpccxadd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fbsdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrs'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='mcdt-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pbrsb-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='psdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='sbdr-ssdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='serialize'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='SierraForest-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx-ifma'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx-ne-convert'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx-vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx-vnni-int8'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='cmpccxadd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fbsdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrs'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='mcdt-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pbrsb-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='psdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='sbdr-ssdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='serialize'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Skylake-Client'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Skylake-Client-IBRS'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Skylake-Client-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Skylake-Client-v2'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Skylake-Client-v3'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Skylake-Client-v4'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Skylake-Server'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Skylake-Server-IBRS'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Skylake-Server-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Skylake-Server-v2'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Skylake-Server-v3'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Skylake-Server-v4'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Skylake-Server-v5'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Snowridge'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='cldemote'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='core-capability'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='movdir64b'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='movdiri'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='mpx'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='split-lock-detect'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Snowridge-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='cldemote'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='core-capability'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='movdir64b'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='movdiri'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='mpx'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='split-lock-detect'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Snowridge-v2'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='cldemote'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='core-capability'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='movdir64b'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='movdiri'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='split-lock-detect'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Snowridge-v3'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='cldemote'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='core-capability'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='movdir64b'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='movdiri'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='split-lock-detect'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Snowridge-v4'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='cldemote'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='movdir64b'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='movdiri'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='athlon'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='3dnow'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='3dnowext'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='athlon-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='3dnow'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='3dnowext'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='core2duo'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ss'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='core2duo-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ss'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='coreduo'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ss'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='coreduo-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ss'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='n270'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ss'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='n270-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ss'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='phenom'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='3dnow'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='3dnowext'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='phenom-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='3dnow'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='3dnowext'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </mode>
Nov 28 17:22:18 compute-0 nova_compute[187223]:   </cpu>
Nov 28 17:22:18 compute-0 nova_compute[187223]:   <memoryBacking supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <enum name='sourceType'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <value>file</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <value>anonymous</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <value>memfd</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:   </memoryBacking>
Nov 28 17:22:18 compute-0 nova_compute[187223]:   <devices>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <disk supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='diskDevice'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>disk</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>cdrom</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>floppy</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>lun</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='bus'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>fdc</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>scsi</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>virtio</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>usb</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>sata</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='model'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>virtio</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>virtio-transitional</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>virtio-non-transitional</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </disk>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <graphics supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='type'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>vnc</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>egl-headless</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>dbus</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </graphics>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <video supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='modelType'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>vga</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>cirrus</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>virtio</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>none</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>bochs</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>ramfb</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </video>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <hostdev supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='mode'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>subsystem</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='startupPolicy'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>default</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>mandatory</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>requisite</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>optional</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='subsysType'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>usb</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>pci</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>scsi</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='capsType'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='pciBackend'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </hostdev>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <rng supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='model'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>virtio</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>virtio-transitional</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>virtio-non-transitional</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='backendModel'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>random</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>egd</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>builtin</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </rng>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <filesystem supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='driverType'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>path</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>handle</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>virtiofs</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </filesystem>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <tpm supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='model'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>tpm-tis</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>tpm-crb</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='backendModel'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>emulator</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>external</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='backendVersion'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>2.0</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </tpm>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <redirdev supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='bus'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>usb</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </redirdev>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <channel supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='type'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>pty</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>unix</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </channel>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <crypto supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='model'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='type'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>qemu</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='backendModel'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>builtin</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </crypto>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <interface supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='backendType'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>default</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>passt</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </interface>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <panic supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='model'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>isa</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>hyperv</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </panic>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <console supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='type'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>null</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>vc</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>pty</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>dev</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>file</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>pipe</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>stdio</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>udp</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>tcp</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>unix</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>qemu-vdagent</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>dbus</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </console>
Nov 28 17:22:18 compute-0 nova_compute[187223]:   </devices>
Nov 28 17:22:18 compute-0 nova_compute[187223]:   <features>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <gic supported='no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <vmcoreinfo supported='yes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <genid supported='yes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <backingStoreInput supported='yes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <backup supported='yes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <async-teardown supported='yes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <ps2 supported='yes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <sev supported='no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <sgx supported='no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <hyperv supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='features'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>relaxed</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>vapic</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>spinlocks</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>vpindex</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>runtime</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>synic</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>stimer</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>reset</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>vendor_id</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>frequencies</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>reenlightenment</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>tlbflush</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>ipi</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>avic</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>emsr_bitmap</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>xmm_input</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <defaults>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <spinlocks>4095</spinlocks>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <stimer_direct>on</stimer_direct>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <tlbflush_direct>on</tlbflush_direct>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <tlbflush_extended>on</tlbflush_extended>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </defaults>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </hyperv>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <launchSecurity supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='sectype'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>tdx</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </launchSecurity>
Nov 28 17:22:18 compute-0 nova_compute[187223]:   </features>
Nov 28 17:22:18 compute-0 nova_compute[187223]: </domainCapabilities>
Nov 28 17:22:18 compute-0 nova_compute[187223]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.545 187227 DEBUG nova.virt.libvirt.host [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Nov 28 17:22:18 compute-0 nova_compute[187223]: <domainCapabilities>
Nov 28 17:22:18 compute-0 nova_compute[187223]:   <path>/usr/libexec/qemu-kvm</path>
Nov 28 17:22:18 compute-0 nova_compute[187223]:   <domain>kvm</domain>
Nov 28 17:22:18 compute-0 nova_compute[187223]:   <machine>pc-i440fx-rhel7.6.0</machine>
Nov 28 17:22:18 compute-0 nova_compute[187223]:   <arch>x86_64</arch>
Nov 28 17:22:18 compute-0 nova_compute[187223]:   <vcpu max='240'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:   <iothreads supported='yes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:   <os supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <enum name='firmware'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <loader supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='type'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>rom</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>pflash</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='readonly'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>yes</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>no</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='secure'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>no</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </loader>
Nov 28 17:22:18 compute-0 nova_compute[187223]:   </os>
Nov 28 17:22:18 compute-0 nova_compute[187223]:   <cpu>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <mode name='host-passthrough' supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='hostPassthroughMigratable'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>on</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>off</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </mode>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <mode name='maximum' supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='maximumMigratable'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>on</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>off</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </mode>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <mode name='host-model' supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <vendor>AMD</vendor>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='x2apic'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='tsc-deadline'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='hypervisor'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='tsc_adjust'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='spec-ctrl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='stibp'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='ssbd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='cmp_legacy'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='overflow-recov'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='succor'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='ibrs'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='amd-ssbd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='virt-ssbd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='lbrv'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='tsc-scale'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='vmcb-clean'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='flushbyasid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='pause-filter'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='pfthreshold'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='svme-addr-chk'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <feature policy='disable' name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </mode>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <mode name='custom' supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Broadwell'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Broadwell-IBRS'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Broadwell-noTSX'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Broadwell-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Broadwell-v2'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Broadwell-v3'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Broadwell-v4'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Cascadelake-Server'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Cascadelake-Server-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Cascadelake-Server-v2'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Cascadelake-Server-v3'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Cascadelake-Server-v4'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Cascadelake-Server-v5'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Cooperlake'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='taa-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Cooperlake-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='taa-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Cooperlake-v2'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='taa-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Denverton'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='mpx'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Denverton-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='mpx'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Denverton-v2'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Denverton-v3'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Dhyana-v2'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='EPYC-Genoa'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amd-psfd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='auto-ibrs'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512ifma'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='no-nested-data-bp'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='null-sel-clr-base'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='stibp-always-on'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='EPYC-Genoa-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amd-psfd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='auto-ibrs'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512ifma'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='no-nested-data-bp'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='null-sel-clr-base'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='stibp-always-on'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='EPYC-Milan'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='EPYC-Milan-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='EPYC-Milan-v2'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amd-psfd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='no-nested-data-bp'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='null-sel-clr-base'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='stibp-always-on'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='EPYC-Rome'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='EPYC-Rome-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='EPYC-Rome-v2'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='EPYC-Rome-v3'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='EPYC-v3'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='EPYC-v4'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='GraniteRapids'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-fp16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-int8'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-tile'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx-vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-fp16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512ifma'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fbsdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrc'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrs'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fzrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='mcdt-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pbrsb-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='prefetchiti'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='psdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='sbdr-ssdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='serialize'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='taa-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='tsx-ldtrk'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xfd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='GraniteRapids-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-fp16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-int8'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-tile'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx-vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-fp16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512ifma'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fbsdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrc'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrs'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fzrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='mcdt-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pbrsb-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='prefetchiti'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='psdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='sbdr-ssdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='serialize'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='taa-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='tsx-ldtrk'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xfd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='GraniteRapids-v2'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-fp16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-int8'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-tile'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx-vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx10'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx10-128'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx10-256'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx10-512'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-fp16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512ifma'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='cldemote'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fbsdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrc'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrs'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fzrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='mcdt-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='movdir64b'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='movdiri'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pbrsb-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='prefetchiti'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='psdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='sbdr-ssdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='serialize'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ss'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='taa-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='tsx-ldtrk'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xfd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Haswell'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Haswell-IBRS'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Haswell-noTSX'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Haswell-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Haswell-v2'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Haswell-v3'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Haswell-v4'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Icelake-Server'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Icelake-Server-noTSX'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Icelake-Server-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Icelake-Server-v2'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Icelake-Server-v3'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='taa-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Icelake-Server-v4'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512ifma'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='taa-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Icelake-Server-v5'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512ifma'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='taa-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Icelake-Server-v6'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512ifma'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='taa-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Icelake-Server-v7'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512ifma'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='taa-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='IvyBridge'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='IvyBridge-IBRS'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='IvyBridge-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='IvyBridge-v2'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='KnightsMill'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-4fmaps'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-4vnniw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512er'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512pf'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ss'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='KnightsMill-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-4fmaps'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-4vnniw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512er'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512pf'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ss'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Opteron_G4'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fma4'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xop'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Opteron_G4-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fma4'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xop'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Opteron_G5'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fma4'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='tbm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xop'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Opteron_G5-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fma4'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='tbm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xop'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='SapphireRapids'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-int8'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-tile'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx-vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-fp16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512ifma'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrc'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrs'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fzrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='serialize'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='taa-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='tsx-ldtrk'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xfd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='SapphireRapids-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-int8'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-tile'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx-vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-fp16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512ifma'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrc'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrs'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fzrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='serialize'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='taa-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='tsx-ldtrk'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xfd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='SapphireRapids-v2'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-int8'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-tile'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx-vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-fp16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512ifma'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fbsdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrc'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrs'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fzrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='psdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='sbdr-ssdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='serialize'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='taa-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='tsx-ldtrk'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xfd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='SapphireRapids-v3'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-int8'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='amx-tile'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx-vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-bf16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-fp16'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512-vpopcntdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bitalg'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512ifma'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vbmi2'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='cldemote'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fbsdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrc'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrs'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fzrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='la57'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='movdir64b'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='movdiri'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='psdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='sbdr-ssdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='serialize'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ss'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='taa-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='tsx-ldtrk'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xfd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='SierraForest'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx-ifma'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx-ne-convert'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx-vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx-vnni-int8'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='cmpccxadd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fbsdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrs'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='mcdt-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pbrsb-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='psdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='sbdr-ssdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='serialize'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='SierraForest-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx-ifma'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx-ne-convert'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx-vnni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx-vnni-int8'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='bus-lock-detect'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='cmpccxadd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fbsdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='fsrs'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ibrs-all'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='mcdt-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pbrsb-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='psdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='sbdr-ssdp-no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='serialize'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vaes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='vpclmulqdq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Skylake-Client'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Skylake-Client-IBRS'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Skylake-Client-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Skylake-Client-v2'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Skylake-Client-v3'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Skylake-Client-v4'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Skylake-Server'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Skylake-Server-IBRS'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Skylake-Server-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Skylake-Server-v2'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='hle'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='rtm'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Skylake-Server-v3'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Skylake-Server-v4'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Skylake-Server-v5'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512bw'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512cd'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512dq'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512f'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='avx512vl'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='invpcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pcid'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='pku'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Snowridge'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='cldemote'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='core-capability'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='movdir64b'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='movdiri'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='mpx'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='split-lock-detect'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Snowridge-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='cldemote'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='core-capability'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='movdir64b'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='movdiri'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='mpx'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='split-lock-detect'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Snowridge-v2'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='cldemote'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='core-capability'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='movdir64b'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='movdiri'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='split-lock-detect'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Snowridge-v3'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='cldemote'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='core-capability'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='movdir64b'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='movdiri'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='split-lock-detect'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='Snowridge-v4'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='cldemote'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='erms'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='gfni'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='movdir64b'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='movdiri'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='xsaves'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='athlon'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='3dnow'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='3dnowext'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='athlon-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='3dnow'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='3dnowext'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='core2duo'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ss'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='core2duo-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ss'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='coreduo'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ss'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='coreduo-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ss'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='n270'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ss'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='n270-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='ss'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='phenom'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='3dnow'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='3dnowext'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <blockers model='phenom-v1'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='3dnow'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <feature name='3dnowext'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </blockers>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </mode>
Nov 28 17:22:18 compute-0 nova_compute[187223]:   </cpu>
Nov 28 17:22:18 compute-0 nova_compute[187223]:   <memoryBacking supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <enum name='sourceType'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <value>file</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <value>anonymous</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <value>memfd</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:   </memoryBacking>
Nov 28 17:22:18 compute-0 nova_compute[187223]:   <devices>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <disk supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='diskDevice'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>disk</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>cdrom</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>floppy</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>lun</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='bus'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>ide</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>fdc</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>scsi</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>virtio</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>usb</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>sata</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='model'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>virtio</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>virtio-transitional</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>virtio-non-transitional</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </disk>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <graphics supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='type'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>vnc</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>egl-headless</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>dbus</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </graphics>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <video supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='modelType'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>vga</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>cirrus</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>virtio</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>none</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>bochs</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>ramfb</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </video>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <hostdev supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='mode'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>subsystem</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='startupPolicy'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>default</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>mandatory</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>requisite</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>optional</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='subsysType'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>usb</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>pci</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>scsi</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='capsType'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='pciBackend'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </hostdev>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <rng supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='model'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>virtio</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>virtio-transitional</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>virtio-non-transitional</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='backendModel'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>random</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>egd</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>builtin</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </rng>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <filesystem supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='driverType'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>path</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>handle</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>virtiofs</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </filesystem>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <tpm supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='model'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>tpm-tis</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>tpm-crb</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='backendModel'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>emulator</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>external</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='backendVersion'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>2.0</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </tpm>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <redirdev supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='bus'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>usb</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </redirdev>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <channel supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='type'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>pty</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>unix</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </channel>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <crypto supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='model'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='type'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>qemu</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='backendModel'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>builtin</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </crypto>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <interface supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='backendType'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>default</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>passt</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </interface>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <panic supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='model'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>isa</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>hyperv</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </panic>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <console supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='type'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>null</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>vc</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>pty</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>dev</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>file</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>pipe</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>stdio</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>udp</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>tcp</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>unix</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>qemu-vdagent</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>dbus</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </console>
Nov 28 17:22:18 compute-0 nova_compute[187223]:   </devices>
Nov 28 17:22:18 compute-0 nova_compute[187223]:   <features>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <gic supported='no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <vmcoreinfo supported='yes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <genid supported='yes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <backingStoreInput supported='yes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <backup supported='yes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <async-teardown supported='yes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <ps2 supported='yes'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <sev supported='no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <sgx supported='no'/>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <hyperv supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='features'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>relaxed</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>vapic</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>spinlocks</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>vpindex</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>runtime</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>synic</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>stimer</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>reset</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>vendor_id</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>frequencies</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>reenlightenment</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>tlbflush</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>ipi</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>avic</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>emsr_bitmap</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>xmm_input</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <defaults>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <spinlocks>4095</spinlocks>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <stimer_direct>on</stimer_direct>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <tlbflush_direct>on</tlbflush_direct>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <tlbflush_extended>on</tlbflush_extended>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </defaults>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </hyperv>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     <launchSecurity supported='yes'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       <enum name='sectype'>
Nov 28 17:22:18 compute-0 nova_compute[187223]:         <value>tdx</value>
Nov 28 17:22:18 compute-0 nova_compute[187223]:       </enum>
Nov 28 17:22:18 compute-0 nova_compute[187223]:     </launchSecurity>
Nov 28 17:22:18 compute-0 nova_compute[187223]:   </features>
Nov 28 17:22:18 compute-0 nova_compute[187223]: </domainCapabilities>
Nov 28 17:22:18 compute-0 nova_compute[187223]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.612 187227 DEBUG nova.virt.libvirt.host [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.612 187227 INFO nova.virt.libvirt.host [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Secure Boot support detected
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.615 187227 INFO nova.virt.libvirt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.615 187227 INFO nova.virt.libvirt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.623 187227 DEBUG nova.virt.libvirt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] cpu compare xml: <cpu match="exact">
Nov 28 17:22:18 compute-0 nova_compute[187223]:   <model>Nehalem</model>
Nov 28 17:22:18 compute-0 nova_compute[187223]: </cpu>
Nov 28 17:22:18 compute-0 nova_compute[187223]:  _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.625 187227 DEBUG nova.virt.libvirt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.724 187227 INFO nova.virt.node [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Determined node identity 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 from /var/lib/nova/compute_id
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.749 187227 WARNING nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Compute nodes ['2d742abf-eadd-46e1-bc0a-5ea4c6acfad5'] for host compute-0.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.773 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.787 187227 WARNING nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] No compute node record found for host compute-0.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.787 187227 DEBUG oslo_concurrency.lockutils [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.787 187227 DEBUG oslo_concurrency.lockutils [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.787 187227 DEBUG oslo_concurrency.lockutils [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.788 187227 DEBUG nova.compute.resource_tracker [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.919 187227 WARNING nova.virt.libvirt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.920 187227 DEBUG nova.compute.resource_tracker [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6199MB free_disk=73.54408264160156GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.920 187227 DEBUG oslo_concurrency.lockutils [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.920 187227 DEBUG oslo_concurrency.lockutils [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.936 187227 WARNING nova.compute.resource_tracker [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] No compute node record for compute-0.ctlplane.example.com:2d742abf-eadd-46e1-bc0a-5ea4c6acfad5: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 could not be found.
Nov 28 17:22:18 compute-0 nova_compute[187223]: 2025-11-28 17:22:18.960 187227 INFO nova.compute.resource_tracker [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Compute node record created for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com with uuid: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5
Nov 28 17:22:19 compute-0 nova_compute[187223]: 2025-11-28 17:22:19.006 187227 DEBUG nova.compute.resource_tracker [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 17:22:19 compute-0 nova_compute[187223]: 2025-11-28 17:22:19.007 187227 DEBUG nova.compute.resource_tracker [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 17:22:19 compute-0 nova_compute[187223]: 2025-11-28 17:22:19.110 187227 INFO nova.scheduler.client.report [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [req-5d2aa22f-4c26-4f8b-82be-101a604d02f5] Created resource provider record via placement API for resource provider with UUID 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 and name compute-0.ctlplane.example.com.
Nov 28 17:22:19 compute-0 nova_compute[187223]: 2025-11-28 17:22:19.144 187227 DEBUG nova.virt.libvirt.host [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Nov 28 17:22:19 compute-0 nova_compute[187223]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Nov 28 17:22:19 compute-0 nova_compute[187223]: 2025-11-28 17:22:19.145 187227 INFO nova.virt.libvirt.host [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] kernel doesn't support AMD SEV
Nov 28 17:22:19 compute-0 nova_compute[187223]: 2025-11-28 17:22:19.146 187227 DEBUG nova.compute.provider_tree [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Updating inventory in ProviderTree for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 with inventory: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 28 17:22:19 compute-0 nova_compute[187223]: 2025-11-28 17:22:19.146 187227 DEBUG nova.virt.libvirt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 28 17:22:19 compute-0 nova_compute[187223]: 2025-11-28 17:22:19.148 187227 DEBUG nova.virt.libvirt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Libvirt baseline CPU <cpu>
Nov 28 17:22:19 compute-0 nova_compute[187223]:   <arch>x86_64</arch>
Nov 28 17:22:19 compute-0 nova_compute[187223]:   <model>Nehalem</model>
Nov 28 17:22:19 compute-0 nova_compute[187223]:   <vendor>AMD</vendor>
Nov 28 17:22:19 compute-0 nova_compute[187223]:   <topology sockets="8" cores="1" threads="1"/>
Nov 28 17:22:19 compute-0 nova_compute[187223]: </cpu>
Nov 28 17:22:19 compute-0 nova_compute[187223]:  _get_guest_baseline_cpu_features /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12537
Nov 28 17:22:19 compute-0 nova_compute[187223]: 2025-11-28 17:22:19.208 187227 DEBUG nova.scheduler.client.report [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Updated inventory for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Nov 28 17:22:19 compute-0 nova_compute[187223]: 2025-11-28 17:22:19.208 187227 DEBUG nova.compute.provider_tree [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Updating resource provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Nov 28 17:22:19 compute-0 nova_compute[187223]: 2025-11-28 17:22:19.209 187227 DEBUG nova.compute.provider_tree [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Updating inventory in ProviderTree for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 with inventory: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 28 17:22:19 compute-0 nova_compute[187223]: 2025-11-28 17:22:19.350 187227 DEBUG nova.compute.provider_tree [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Updating resource provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Nov 28 17:22:19 compute-0 nova_compute[187223]: 2025-11-28 17:22:19.378 187227 DEBUG nova.compute.resource_tracker [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 17:22:19 compute-0 nova_compute[187223]: 2025-11-28 17:22:19.378 187227 DEBUG oslo_concurrency.lockutils [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.458s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:22:19 compute-0 nova_compute[187223]: 2025-11-28 17:22:19.378 187227 DEBUG nova.service [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Nov 28 17:22:19 compute-0 nova_compute[187223]: 2025-11-28 17:22:19.474 187227 DEBUG nova.service [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Nov 28 17:22:19 compute-0 nova_compute[187223]: 2025-11-28 17:22:19.475 187227 DEBUG nova.servicegroup.drivers.db [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] DB_Driver: join new ServiceGroup member compute-0.ctlplane.example.com to the compute group, service = <Service: host=compute-0.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Nov 28 17:22:23 compute-0 sshd-session[187519]: Accepted publickey for zuul from 192.168.122.30 port 35150 ssh2: ECDSA SHA256:WZF57Px3euv6N78lWIGFzFXKuQ6jzdjGOx/DgOfNYD8
Nov 28 17:22:23 compute-0 systemd-logind[788]: New session 26 of user zuul.
Nov 28 17:22:23 compute-0 systemd[1]: Started Session 26 of User zuul.
Nov 28 17:22:23 compute-0 sshd-session[187519]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 28 17:22:24 compute-0 python3.9[187672]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 17:22:25 compute-0 sudo[187826]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thamlhsmmlbhpuxjjrczawqkyqqswipt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350544.8752584-56-124412027574145/AnsiballZ_systemd_service.py'
Nov 28 17:22:25 compute-0 sudo[187826]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:22:25 compute-0 python3.9[187828]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 17:22:26 compute-0 systemd[1]: Reloading.
Nov 28 17:22:26 compute-0 systemd-rc-local-generator[187861]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 17:22:26 compute-0 systemd-sysv-generator[187864]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 17:22:26 compute-0 sudo[187826]: pam_unix(sudo:session): session closed for user root
Nov 28 17:22:27 compute-0 python3.9[188015]: ansible-ansible.builtin.service_facts Invoked
Nov 28 17:22:27 compute-0 network[188032]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 28 17:22:27 compute-0 network[188033]: 'network-scripts' will be removed from distribution in near future.
Nov 28 17:22:27 compute-0 network[188034]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 28 17:22:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:22:27.664 104433 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:22:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:22:27.666 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:22:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:22:27.666 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:22:32 compute-0 sudo[188306]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-njwkcaadhspxhxctdkanshzyczsahspf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350552.0139525-94-132818005630171/AnsiballZ_systemd_service.py'
Nov 28 17:22:32 compute-0 sudo[188306]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:22:32 compute-0 python3.9[188308]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 17:22:32 compute-0 sudo[188306]: pam_unix(sudo:session): session closed for user root
Nov 28 17:22:33 compute-0 sudo[188459]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxnuodrdanjlxwehmwvuhjqvyzqqddaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350553.000309-114-248390605976163/AnsiballZ_file.py'
Nov 28 17:22:33 compute-0 sudo[188459]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:22:33 compute-0 python3.9[188461]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:22:33 compute-0 sudo[188459]: pam_unix(sudo:session): session closed for user root
Nov 28 17:22:33 compute-0 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 17:22:33 compute-0 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 17:22:34 compute-0 sudo[188628]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-veottdnmujakogwgqfkolbubmafphnat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350553.8657014-130-150581080138024/AnsiballZ_file.py'
Nov 28 17:22:34 compute-0 sudo[188628]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:22:34 compute-0 podman[188586]: 2025-11-28 17:22:34.208254978 +0000 UTC m=+0.092194419 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 17:22:34 compute-0 python3.9[188634]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:22:34 compute-0 sudo[188628]: pam_unix(sudo:session): session closed for user root
Nov 28 17:22:35 compute-0 sudo[188784]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hctznfgzkoruxdjyuacwzzvfpqgehmpl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350554.6843662-148-114841834092273/AnsiballZ_command.py'
Nov 28 17:22:35 compute-0 sudo[188784]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:22:35 compute-0 python3.9[188786]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 17:22:35 compute-0 sudo[188784]: pam_unix(sudo:session): session closed for user root
Nov 28 17:22:36 compute-0 python3.9[188938]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 28 17:22:36 compute-0 sudo[189088]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pthrjnnkbugodqxmdqfixhjrxlhzyzrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350556.5837154-184-36739955380482/AnsiballZ_systemd_service.py'
Nov 28 17:22:36 compute-0 sudo[189088]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:22:37 compute-0 python3.9[189090]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 17:22:37 compute-0 systemd[1]: Reloading.
Nov 28 17:22:37 compute-0 systemd-rc-local-generator[189117]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 17:22:37 compute-0 systemd-sysv-generator[189122]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 17:22:37 compute-0 sudo[189088]: pam_unix(sudo:session): session closed for user root
Nov 28 17:22:38 compute-0 sudo[189275]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hjzngbfyiogdfainfdbqahqypscsefiq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350557.750084-200-4523553764985/AnsiballZ_command.py'
Nov 28 17:22:38 compute-0 sudo[189275]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:22:38 compute-0 python3.9[189277]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 17:22:38 compute-0 sudo[189275]: pam_unix(sudo:session): session closed for user root
Nov 28 17:22:38 compute-0 sudo[189428]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jyjsdvpzhthaginxvggqvszecvsrhdml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350558.5008292-218-270651311995224/AnsiballZ_file.py'
Nov 28 17:22:38 compute-0 sudo[189428]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:22:38 compute-0 python3.9[189430]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/config/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:22:39 compute-0 sudo[189428]: pam_unix(sudo:session): session closed for user root
Nov 28 17:22:39 compute-0 podman[189431]: 2025-11-28 17:22:39.167154036 +0000 UTC m=+0.147887624 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller)
Nov 28 17:22:39 compute-0 podman[189581]: 2025-11-28 17:22:39.703980573 +0000 UTC m=+0.062431812 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 28 17:22:39 compute-0 python3.9[189625]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 17:22:40 compute-0 python3.9[189780]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:22:41 compute-0 python3.9[189902]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764350560.1031413-250-138526654834896/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=e86e0e43000ce9ccfe5aefbf8e8f2e3d15d05584 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:22:42 compute-0 sudo[190052]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxisamkuvorgfylcubufzcgqtcdcfywj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350561.5299313-280-95025923926136/AnsiballZ_group.py'
Nov 28 17:22:42 compute-0 sudo[190052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:22:42 compute-0 python3.9[190054]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Nov 28 17:22:42 compute-0 sudo[190052]: pam_unix(sudo:session): session closed for user root
Nov 28 17:22:43 compute-0 sudo[190204]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tijwjxcglbetaxvymowitorbgbzyeyrx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350562.6341238-302-14737140627373/AnsiballZ_getent.py'
Nov 28 17:22:43 compute-0 sudo[190204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:22:43 compute-0 python3.9[190206]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Nov 28 17:22:43 compute-0 sudo[190204]: pam_unix(sudo:session): session closed for user root
Nov 28 17:22:43 compute-0 sudo[190357]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ksdqdvoyqbkgwkzmrmbocoqzuakbkjeh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350563.6250951-318-3044478239301/AnsiballZ_group.py'
Nov 28 17:22:43 compute-0 sudo[190357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:22:44 compute-0 python3.9[190359]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 28 17:22:44 compute-0 groupadd[190360]: group added to /etc/group: name=ceilometer, GID=42405
Nov 28 17:22:44 compute-0 groupadd[190360]: group added to /etc/gshadow: name=ceilometer
Nov 28 17:22:44 compute-0 groupadd[190360]: new group: name=ceilometer, GID=42405
Nov 28 17:22:44 compute-0 sudo[190357]: pam_unix(sudo:session): session closed for user root
Nov 28 17:22:45 compute-0 sudo[190515]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xzygyfuvjislbejonhxxllysrptaaope ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350564.456044-334-229258484574163/AnsiballZ_user.py'
Nov 28 17:22:45 compute-0 sudo[190515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:22:45 compute-0 python3.9[190517]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 28 17:22:45 compute-0 useradd[190519]: new user: name=ceilometer, UID=42405, GID=42405, home=/home/ceilometer, shell=/sbin/nologin, from=/dev/pts/0
Nov 28 17:22:45 compute-0 useradd[190519]: add 'ceilometer' to group 'libvirt'
Nov 28 17:22:45 compute-0 useradd[190519]: add 'ceilometer' to shadow group 'libvirt'
Nov 28 17:22:45 compute-0 sudo[190515]: pam_unix(sudo:session): session closed for user root
Nov 28 17:22:46 compute-0 python3.9[190675]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:22:47 compute-0 python3.9[190796]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1764350566.1745956-386-4229073531974/.source.conf _original_basename=ceilometer.conf follow=False checksum=f74f01c63e6cdeca5458ef9aff2a1db5d6a4e4b9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:22:47 compute-0 python3.9[190946]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:22:48 compute-0 python3.9[191067]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1764350567.435179-386-6438566351122/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:22:49 compute-0 python3.9[191217]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:22:49 compute-0 python3.9[191338]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1764350568.5582504-386-111358810052504/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:22:50 compute-0 python3.9[191488]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 17:22:50 compute-0 nova_compute[187223]: 2025-11-28 17:22:50.477 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:22:50 compute-0 nova_compute[187223]: 2025-11-28 17:22:50.496 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:22:50 compute-0 python3.9[191640]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 17:22:52 compute-0 python3.9[191792]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:22:52 compute-0 python3.9[191913]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764350571.1614792-504-264897629206285/.source.json follow=False _original_basename=ceilometer-agent-compute.json.j2 checksum=264d11e8d3809e7ef745878dce7edd46098e25b2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:22:53 compute-0 python3.9[192063]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:22:54 compute-0 python3.9[192139]: ansible-ansible.legacy.file Invoked with mode=420 dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf _original_basename=ceilometer-host-specific.conf.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:22:54 compute-0 python3.9[192289]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:22:55 compute-0 python3.9[192410]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764350574.4359426-504-47258424830650/.source.json follow=False _original_basename=ceilometer_agent_compute.json.j2 checksum=17453a32c9d181134878b3e453cb84c3cd9bd67d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:22:55 compute-0 sshd-session[189781]: banner exchange: Connection from 118.31.249.253 port 47372: invalid format
Nov 28 17:22:56 compute-0 python3.9[192560]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:22:56 compute-0 python3.9[192681]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764350575.6750255-504-261355951117987/.source.yaml follow=False _original_basename=ceilometer_prom_exporter.yaml.j2 checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:22:57 compute-0 python3.9[192831]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:22:58 compute-0 python3.9[192952]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/firewall.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764350576.8591847-504-70175674307023/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:22:58 compute-0 python3.9[193102]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:22:59 compute-0 python3.9[193223]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764350578.2252803-504-140649556216554/.source.json follow=False _original_basename=node_exporter.json.j2 checksum=6e4982940d2bfae88404914dfaf72552f6356d81 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:22:59 compute-0 python3.9[193373]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:23:00 compute-0 python3.9[193494]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764350579.4614925-504-130437502951201/.source.yaml follow=False _original_basename=node_exporter.yaml.j2 checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:23:01 compute-0 python3.9[193644]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:23:01 compute-0 python3.9[193765]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764350580.742176-504-232740023335661/.source.json follow=False _original_basename=openstack_network_exporter.json.j2 checksum=d474f1e4c3dbd24762592c51cbe5311f0a037273 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:23:02 compute-0 python3.9[193915]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:23:03 compute-0 python3.9[194036]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764350582.0130768-504-226448228425620/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=2b6bd0891e609bf38a73282f42888052b750bed6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:23:03 compute-0 python3.9[194186]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:23:04 compute-0 python3.9[194307]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764350583.2237322-504-18285408590164/.source.json follow=False _original_basename=podman_exporter.json.j2 checksum=e342121a88f67e2bae7ebc05d1e6d350470198a5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:23:04 compute-0 podman[194308]: 2025-11-28 17:23:04.467222027 +0000 UTC m=+0.104604059 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 28 17:23:05 compute-0 python3.9[194476]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:23:05 compute-0 python3.9[194597]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764350584.5185995-504-187249593151448/.source.yaml follow=False _original_basename=podman_exporter.yaml.j2 checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:23:06 compute-0 python3.9[194747]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:23:06 compute-0 python3.9[194823]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/node_exporter.yaml _original_basename=node_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/node_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:23:07 compute-0 python3.9[194973]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:23:07 compute-0 python3.9[195049]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml _original_basename=podman_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/podman_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:23:08 compute-0 python3.9[195199]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:23:09 compute-0 python3.9[195275]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml _original_basename=ceilometer_prom_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:23:09 compute-0 podman[195276]: 2025-11-28 17:23:09.396335698 +0000 UTC m=+0.097127530 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 28 17:23:09 compute-0 sudo[195451]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ugycscbloyitfpjmfqjqvmlumklfjior ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350589.4780214-882-213521155489112/AnsiballZ_file.py'
Nov 28 17:23:09 compute-0 sudo[195451]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:23:09 compute-0 podman[195453]: 2025-11-28 17:23:09.897842491 +0000 UTC m=+0.092980870 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 28 17:23:09 compute-0 python3.9[195454]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.crt recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:23:10 compute-0 sudo[195451]: pam_unix(sudo:session): session closed for user root
Nov 28 17:23:10 compute-0 sudo[195624]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mtogrtmbjxacawwpdaojdbgftfqxkkfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350590.2410324-898-174850222604609/AnsiballZ_file.py'
Nov 28 17:23:10 compute-0 sudo[195624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:23:10 compute-0 python3.9[195626]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.key recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:23:10 compute-0 sudo[195624]: pam_unix(sudo:session): session closed for user root
Nov 28 17:23:11 compute-0 sudo[195776]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kjmfhezauxmnzdyhyagymoyrfcdutisl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350590.947554-914-249788236094649/AnsiballZ_file.py'
Nov 28 17:23:11 compute-0 sudo[195776]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:23:11 compute-0 python3.9[195778]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:23:11 compute-0 sudo[195776]: pam_unix(sudo:session): session closed for user root
Nov 28 17:23:11 compute-0 sudo[195928]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rtbhhdkswyuvucwbkqxencmlpvqepvzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350591.6327822-930-199055456124508/AnsiballZ_systemd_service.py'
Nov 28 17:23:11 compute-0 sudo[195928]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:23:12 compute-0 python3.9[195930]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 17:23:12 compute-0 systemd[1]: Reloading.
Nov 28 17:23:12 compute-0 systemd-rc-local-generator[195962]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 17:23:12 compute-0 systemd-sysv-generator[195967]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 17:23:12 compute-0 systemd[1]: Listening on Podman API Socket.
Nov 28 17:23:12 compute-0 sudo[195928]: pam_unix(sudo:session): session closed for user root
Nov 28 17:23:13 compute-0 sudo[196120]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmjpixhknrzskmsbcwtbxsckwjgdloxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350593.1442685-948-123615356882283/AnsiballZ_stat.py'
Nov 28 17:23:13 compute-0 sudo[196120]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:23:13 compute-0 python3.9[196122]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:23:13 compute-0 sudo[196120]: pam_unix(sudo:session): session closed for user root
Nov 28 17:23:14 compute-0 sudo[196243]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-saqcffxndtppfncuayxoxyglvasxtbxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350593.1442685-948-123615356882283/AnsiballZ_copy.py'
Nov 28 17:23:14 compute-0 sudo[196243]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:23:14 compute-0 python3.9[196245]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764350593.1442685-948-123615356882283/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:23:14 compute-0 sudo[196243]: pam_unix(sudo:session): session closed for user root
Nov 28 17:23:15 compute-0 sudo[196395]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrcxtoruemwlnjhbxwbvxwbpzjdqiqib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350594.6834128-982-206905779846321/AnsiballZ_container_config_data.py'
Nov 28 17:23:15 compute-0 sudo[196395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:23:15 compute-0 python3.9[196397]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=podman_exporter.json debug=False
Nov 28 17:23:15 compute-0 sudo[196395]: pam_unix(sudo:session): session closed for user root
Nov 28 17:23:16 compute-0 sudo[196547]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hzpidzovjirpvujcjtqkrxxnutvmxwuo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350595.7029798-1000-275660914745469/AnsiballZ_container_config_hash.py'
Nov 28 17:23:16 compute-0 sudo[196547]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:23:16 compute-0 python3.9[196549]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 28 17:23:16 compute-0 sudo[196547]: pam_unix(sudo:session): session closed for user root
Nov 28 17:23:17 compute-0 sudo[196699]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpmwcdktvidevnctweqxqaefufixyzzz ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764350596.713451-1020-46945218921349/AnsiballZ_edpm_container_manage.py'
Nov 28 17:23:17 compute-0 sudo[196699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:23:17 compute-0 python3[196701]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=podman_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Nov 28 17:23:17 compute-0 nova_compute[187223]: 2025-11-28 17:23:17.685 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:23:17 compute-0 nova_compute[187223]: 2025-11-28 17:23:17.688 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:23:17 compute-0 nova_compute[187223]: 2025-11-28 17:23:17.688 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 17:23:17 compute-0 nova_compute[187223]: 2025-11-28 17:23:17.689 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 17:23:17 compute-0 nova_compute[187223]: 2025-11-28 17:23:17.709 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 17:23:17 compute-0 nova_compute[187223]: 2025-11-28 17:23:17.709 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:23:17 compute-0 nova_compute[187223]: 2025-11-28 17:23:17.710 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:23:17 compute-0 nova_compute[187223]: 2025-11-28 17:23:17.710 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:23:17 compute-0 nova_compute[187223]: 2025-11-28 17:23:17.710 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:23:17 compute-0 nova_compute[187223]: 2025-11-28 17:23:17.711 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:23:17 compute-0 nova_compute[187223]: 2025-11-28 17:23:17.711 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:23:17 compute-0 nova_compute[187223]: 2025-11-28 17:23:17.711 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 17:23:17 compute-0 nova_compute[187223]: 2025-11-28 17:23:17.711 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:23:17 compute-0 nova_compute[187223]: 2025-11-28 17:23:17.743 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:23:17 compute-0 nova_compute[187223]: 2025-11-28 17:23:17.744 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:23:17 compute-0 nova_compute[187223]: 2025-11-28 17:23:17.744 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:23:17 compute-0 nova_compute[187223]: 2025-11-28 17:23:17.744 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 17:23:17 compute-0 nova_compute[187223]: 2025-11-28 17:23:17.920 187227 WARNING nova.virt.libvirt.driver [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 17:23:17 compute-0 nova_compute[187223]: 2025-11-28 17:23:17.922 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6162MB free_disk=73.54412841796875GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 17:23:17 compute-0 nova_compute[187223]: 2025-11-28 17:23:17.922 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:23:17 compute-0 nova_compute[187223]: 2025-11-28 17:23:17.923 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:23:17 compute-0 nova_compute[187223]: 2025-11-28 17:23:17.987 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 17:23:17 compute-0 nova_compute[187223]: 2025-11-28 17:23:17.987 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 17:23:18 compute-0 nova_compute[187223]: 2025-11-28 17:23:18.022 187227 DEBUG nova.compute.provider_tree [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 17:23:18 compute-0 nova_compute[187223]: 2025-11-28 17:23:18.037 187227 DEBUG nova.scheduler.client.report [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 17:23:18 compute-0 nova_compute[187223]: 2025-11-28 17:23:18.039 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 17:23:18 compute-0 nova_compute[187223]: 2025-11-28 17:23:18.039 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.117s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:23:19 compute-0 podman[196713]: 2025-11-28 17:23:19.077877925 +0000 UTC m=+1.449797525 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Nov 28 17:23:19 compute-0 podman[196807]: 2025-11-28 17:23:19.223319563 +0000 UTC m=+0.048760102 container create d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, config_id=edpm, container_name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 17:23:19 compute-0 podman[196807]: 2025-11-28 17:23:19.197969734 +0000 UTC m=+0.023410303 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Nov 28 17:23:19 compute-0 python3[196701]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env OS_ENDPOINT_TYPE=internal --env CONTAINER_HOST=unix:///run/podman/podman.sock --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=edpm --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter:v1.10.1 --web.config.file=/etc/podman_exporter/podman_exporter.yaml
Nov 28 17:23:19 compute-0 sudo[196699]: pam_unix(sudo:session): session closed for user root
Nov 28 17:23:19 compute-0 sudo[196995]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uttxbaqbidiotloixdryqgiysycgprbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350599.5753276-1036-102416637327700/AnsiballZ_stat.py'
Nov 28 17:23:19 compute-0 sudo[196995]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:23:20 compute-0 python3.9[196997]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 17:23:20 compute-0 sudo[196995]: pam_unix(sudo:session): session closed for user root
Nov 28 17:23:20 compute-0 sudo[197149]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dotxjiyqobakzbqvglsxwilawyibttes ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350600.352383-1054-50155836446240/AnsiballZ_file.py'
Nov 28 17:23:20 compute-0 sudo[197149]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:23:20 compute-0 python3.9[197151]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:23:20 compute-0 sudo[197149]: pam_unix(sudo:session): session closed for user root
Nov 28 17:23:21 compute-0 sudo[197300]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnchfgtnabsypapmxmbbskgnrrazdqlc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350600.9951663-1054-225624735462367/AnsiballZ_copy.py'
Nov 28 17:23:21 compute-0 sudo[197300]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:23:21 compute-0 python3.9[197302]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764350600.9951663-1054-225624735462367/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:23:21 compute-0 sudo[197300]: pam_unix(sudo:session): session closed for user root
Nov 28 17:23:22 compute-0 sudo[197376]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxrfbnlcpqrxbwrbnjjyrrjaanvtovfp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350600.9951663-1054-225624735462367/AnsiballZ_systemd.py'
Nov 28 17:23:22 compute-0 sudo[197376]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:23:22 compute-0 python3.9[197378]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 17:23:22 compute-0 systemd[1]: Reloading.
Nov 28 17:23:22 compute-0 systemd-rc-local-generator[197405]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 17:23:22 compute-0 systemd-sysv-generator[197409]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 17:23:22 compute-0 sudo[197376]: pam_unix(sudo:session): session closed for user root
Nov 28 17:23:22 compute-0 sudo[197486]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kgzaeepxugrpiqszntmhxxhcijkvaekn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350600.9951663-1054-225624735462367/AnsiballZ_systemd.py'
Nov 28 17:23:22 compute-0 sudo[197486]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:23:23 compute-0 python3.9[197488]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 17:23:23 compute-0 systemd[1]: Reloading.
Nov 28 17:23:23 compute-0 systemd-rc-local-generator[197521]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 17:23:23 compute-0 systemd-sysv-generator[197525]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 17:23:23 compute-0 systemd[1]: Starting podman_exporter container...
Nov 28 17:23:23 compute-0 systemd[1]: Started libcrun container.
Nov 28 17:23:23 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f2b0a6763e0317bccad74a949392d0304fbfed77516efcd42ebe46dcabe62606/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 28 17:23:23 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f2b0a6763e0317bccad74a949392d0304fbfed77516efcd42ebe46dcabe62606/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 28 17:23:23 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf.
Nov 28 17:23:23 compute-0 podman[197529]: 2025-11-28 17:23:23.799630344 +0000 UTC m=+0.148225330 container init d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 17:23:23 compute-0 podman_exporter[197545]: ts=2025-11-28T17:23:23.817Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Nov 28 17:23:23 compute-0 podman_exporter[197545]: ts=2025-11-28T17:23:23.817Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Nov 28 17:23:23 compute-0 podman_exporter[197545]: ts=2025-11-28T17:23:23.818Z caller=handler.go:94 level=info msg="enabled collectors"
Nov 28 17:23:23 compute-0 podman_exporter[197545]: ts=2025-11-28T17:23:23.818Z caller=handler.go:105 level=info collector=container
Nov 28 17:23:23 compute-0 podman[197529]: 2025-11-28 17:23:23.831498552 +0000 UTC m=+0.180093488 container start d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 17:23:23 compute-0 podman[197529]: podman_exporter
Nov 28 17:23:23 compute-0 systemd[1]: Starting Podman API Service...
Nov 28 17:23:23 compute-0 systemd[1]: Started Podman API Service.
Nov 28 17:23:23 compute-0 systemd[1]: Started podman_exporter container.
Nov 28 17:23:23 compute-0 podman[197556]: time="2025-11-28T17:23:23Z" level=info msg="/usr/bin/podman filtering at log level info"
Nov 28 17:23:23 compute-0 podman[197556]: time="2025-11-28T17:23:23Z" level=info msg="Setting parallel job count to 25"
Nov 28 17:23:23 compute-0 podman[197556]: time="2025-11-28T17:23:23Z" level=info msg="Using sqlite as database backend"
Nov 28 17:23:23 compute-0 podman[197556]: time="2025-11-28T17:23:23Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Nov 28 17:23:23 compute-0 podman[197556]: time="2025-11-28T17:23:23Z" level=info msg="Using systemd socket activation to determine API endpoint"
Nov 28 17:23:23 compute-0 podman[197556]: time="2025-11-28T17:23:23Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"unix:///run/podman/podman.sock\""
Nov 28 17:23:23 compute-0 podman[197556]: @ - - [28/Nov/2025:17:23:23 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Nov 28 17:23:23 compute-0 podman[197556]: time="2025-11-28T17:23:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:23:23 compute-0 sudo[197486]: pam_unix(sudo:session): session closed for user root
Nov 28 17:23:23 compute-0 podman[197556]: @ - - [28/Nov/2025:17:23:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 14064 "" "Go-http-client/1.1"
Nov 28 17:23:23 compute-0 podman_exporter[197545]: ts=2025-11-28T17:23:23.909Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Nov 28 17:23:23 compute-0 podman_exporter[197545]: ts=2025-11-28T17:23:23.911Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Nov 28 17:23:23 compute-0 podman_exporter[197545]: ts=2025-11-28T17:23:23.911Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Nov 28 17:23:23 compute-0 podman[197554]: 2025-11-28 17:23:23.913526562 +0000 UTC m=+0.071483013 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=starting, health_failing_streak=1, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 17:23:23 compute-0 systemd[1]: d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf-178e20228e1e077f.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 17:23:23 compute-0 systemd[1]: d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf-178e20228e1e077f.service: Failed with result 'exit-code'.
Nov 28 17:23:24 compute-0 sudo[197739]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rofwpcljslhjkwpfnjvyvrdsxeaefzhe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350604.099751-1102-77445713890560/AnsiballZ_systemd.py'
Nov 28 17:23:24 compute-0 sudo[197739]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:23:24 compute-0 python3.9[197741]: ansible-ansible.builtin.systemd Invoked with name=edpm_podman_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 17:23:24 compute-0 systemd[1]: Stopping podman_exporter container...
Nov 28 17:23:24 compute-0 podman[197556]: @ - - [28/Nov/2025:17:23:23 +0000] "GET /v4.9.3/libpod/events?filters=%7B%7D&since=&stream=true&until= HTTP/1.1" 200 1449 "" "Go-http-client/1.1"
Nov 28 17:23:24 compute-0 systemd[1]: libpod-d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf.scope: Deactivated successfully.
Nov 28 17:23:24 compute-0 podman[197745]: 2025-11-28 17:23:24.845288552 +0000 UTC m=+0.067104196 container died d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 17:23:24 compute-0 systemd[1]: d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf-178e20228e1e077f.timer: Deactivated successfully.
Nov 28 17:23:24 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf.
Nov 28 17:23:24 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf-userdata-shm.mount: Deactivated successfully.
Nov 28 17:23:24 compute-0 systemd[1]: var-lib-containers-storage-overlay-f2b0a6763e0317bccad74a949392d0304fbfed77516efcd42ebe46dcabe62606-merged.mount: Deactivated successfully.
Nov 28 17:23:25 compute-0 podman[197745]: 2025-11-28 17:23:25.279991618 +0000 UTC m=+0.501807252 container cleanup d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 17:23:25 compute-0 podman[197745]: podman_exporter
Nov 28 17:23:25 compute-0 systemd[1]: edpm_podman_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Nov 28 17:23:25 compute-0 podman[197774]: podman_exporter
Nov 28 17:23:25 compute-0 systemd[1]: edpm_podman_exporter.service: Failed with result 'exit-code'.
Nov 28 17:23:25 compute-0 systemd[1]: Stopped podman_exporter container.
Nov 28 17:23:25 compute-0 systemd[1]: Starting podman_exporter container...
Nov 28 17:23:25 compute-0 systemd[1]: Started libcrun container.
Nov 28 17:23:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f2b0a6763e0317bccad74a949392d0304fbfed77516efcd42ebe46dcabe62606/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 28 17:23:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f2b0a6763e0317bccad74a949392d0304fbfed77516efcd42ebe46dcabe62606/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 28 17:23:25 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf.
Nov 28 17:23:25 compute-0 podman[197787]: 2025-11-28 17:23:25.541087006 +0000 UTC m=+0.135829299 container init d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 17:23:25 compute-0 podman_exporter[197803]: ts=2025-11-28T17:23:25.559Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Nov 28 17:23:25 compute-0 podman_exporter[197803]: ts=2025-11-28T17:23:25.559Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Nov 28 17:23:25 compute-0 podman_exporter[197803]: ts=2025-11-28T17:23:25.559Z caller=handler.go:94 level=info msg="enabled collectors"
Nov 28 17:23:25 compute-0 podman_exporter[197803]: ts=2025-11-28T17:23:25.559Z caller=handler.go:105 level=info collector=container
Nov 28 17:23:25 compute-0 podman[197556]: @ - - [28/Nov/2025:17:23:25 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Nov 28 17:23:25 compute-0 podman[197556]: time="2025-11-28T17:23:25Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:23:25 compute-0 podman[197787]: 2025-11-28 17:23:25.574673514 +0000 UTC m=+0.169415797 container start d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 17:23:25 compute-0 podman[197787]: podman_exporter
Nov 28 17:23:25 compute-0 podman[197556]: @ - - [28/Nov/2025:17:23:25 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 14066 "" "Go-http-client/1.1"
Nov 28 17:23:25 compute-0 podman_exporter[197803]: ts=2025-11-28T17:23:25.581Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Nov 28 17:23:25 compute-0 podman_exporter[197803]: ts=2025-11-28T17:23:25.583Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Nov 28 17:23:25 compute-0 podman_exporter[197803]: ts=2025-11-28T17:23:25.584Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Nov 28 17:23:25 compute-0 systemd[1]: Started podman_exporter container.
Nov 28 17:23:25 compute-0 sudo[197739]: pam_unix(sudo:session): session closed for user root
Nov 28 17:23:25 compute-0 podman[197813]: 2025-11-28 17:23:25.64589696 +0000 UTC m=+0.057191158 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 17:23:26 compute-0 sudo[197985]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lfgngejkwejfadymyxexphsjofqagidl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350605.7971246-1118-147589120014221/AnsiballZ_stat.py'
Nov 28 17:23:26 compute-0 sudo[197985]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:23:26 compute-0 python3.9[197987]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:23:26 compute-0 sudo[197985]: pam_unix(sudo:session): session closed for user root
Nov 28 17:23:26 compute-0 sudo[198108]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-alywsmxvetqeiqwdqzuwushyvhtkhozv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350605.7971246-1118-147589120014221/AnsiballZ_copy.py'
Nov 28 17:23:26 compute-0 sudo[198108]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:23:26 compute-0 python3.9[198110]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764350605.7971246-1118-147589120014221/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 17:23:27 compute-0 sudo[198108]: pam_unix(sudo:session): session closed for user root
Nov 28 17:23:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:23:27.666 104433 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:23:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:23:27.668 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:23:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:23:27.668 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:23:27 compute-0 sudo[198260]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qsdffxxgptvfkgdzmnrlrrnfaxzfgwyc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350607.410375-1152-26847585524105/AnsiballZ_container_config_data.py'
Nov 28 17:23:27 compute-0 sudo[198260]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:23:27 compute-0 python3.9[198262]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=openstack_network_exporter.json debug=False
Nov 28 17:23:27 compute-0 sudo[198260]: pam_unix(sudo:session): session closed for user root
Nov 28 17:23:28 compute-0 sudo[198412]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbrybjoyfaiajojyoxyraotrtrkusfne ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350608.2683952-1170-188337390854901/AnsiballZ_container_config_hash.py'
Nov 28 17:23:28 compute-0 sudo[198412]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:23:28 compute-0 python3.9[198414]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 28 17:23:28 compute-0 sudo[198412]: pam_unix(sudo:session): session closed for user root
Nov 28 17:23:29 compute-0 auditd[704]: Audit daemon rotating log files
Nov 28 17:23:29 compute-0 sudo[198564]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bymvmuwixagocqvwvttymndmfiozavtm ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764350609.1479607-1190-69738937991632/AnsiballZ_edpm_container_manage.py'
Nov 28 17:23:29 compute-0 sudo[198564]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:23:29 compute-0 python3[198566]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=openstack_network_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Nov 28 17:23:33 compute-0 podman[198578]: 2025-11-28 17:23:33.985236967 +0000 UTC m=+3.939173819 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Nov 28 17:23:34 compute-0 podman[198673]: 2025-11-28 17:23:34.133637291 +0000 UTC m=+0.046603919 container create 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, io.buildah.version=1.33.7, vcs-type=git, build-date=2025-08-20T13:12:41, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., architecture=x86_64, config_id=edpm)
Nov 28 17:23:34 compute-0 podman[198673]: 2025-11-28 17:23:34.108639753 +0000 UTC m=+0.021606401 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Nov 28 17:23:34 compute-0 python3[198566]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OS_ENDPOINT_TYPE=internal --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=edpm --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Nov 28 17:23:34 compute-0 sudo[198564]: pam_unix(sudo:session): session closed for user root
Nov 28 17:23:34 compute-0 sudo[198871]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jhxrykgdvhpuqhbydcebecsebrjgvhdh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350614.4733558-1206-7489073064327/AnsiballZ_stat.py'
Nov 28 17:23:34 compute-0 sudo[198871]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:23:34 compute-0 podman[198836]: 2025-11-28 17:23:34.824644975 +0000 UTC m=+0.078922841 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 28 17:23:34 compute-0 python3.9[198874]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 17:23:35 compute-0 sudo[198871]: pam_unix(sudo:session): session closed for user root
Nov 28 17:23:35 compute-0 sudo[199033]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fauulymbaqbzvbeyvalvouqgagchmcww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350615.296483-1224-13145717011880/AnsiballZ_file.py'
Nov 28 17:23:35 compute-0 sudo[199033]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:23:35 compute-0 python3.9[199035]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:23:35 compute-0 sudo[199033]: pam_unix(sudo:session): session closed for user root
Nov 28 17:23:36 compute-0 sudo[199184]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ufidhxgokzhldismxkpgyopbnnzjvdle ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350615.918635-1224-108130596286856/AnsiballZ_copy.py'
Nov 28 17:23:36 compute-0 sudo[199184]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:23:36 compute-0 python3.9[199186]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764350615.918635-1224-108130596286856/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:23:36 compute-0 sudo[199184]: pam_unix(sudo:session): session closed for user root
Nov 28 17:23:36 compute-0 sudo[199260]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-emoyjcexmqktcurpunhlmyaqznrelytw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350615.918635-1224-108130596286856/AnsiballZ_systemd.py'
Nov 28 17:23:36 compute-0 sudo[199260]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:23:37 compute-0 python3.9[199262]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 17:23:37 compute-0 systemd[1]: Reloading.
Nov 28 17:23:37 compute-0 systemd-sysv-generator[199291]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 17:23:37 compute-0 systemd-rc-local-generator[199287]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 17:23:37 compute-0 sudo[199260]: pam_unix(sudo:session): session closed for user root
Nov 28 17:23:37 compute-0 sudo[199370]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwbwsyxtjporynmnadbitseavxjsiefi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350615.918635-1224-108130596286856/AnsiballZ_systemd.py'
Nov 28 17:23:37 compute-0 sudo[199370]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:23:38 compute-0 python3.9[199372]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 17:23:38 compute-0 systemd[1]: Reloading.
Nov 28 17:23:38 compute-0 systemd-rc-local-generator[199400]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 17:23:38 compute-0 systemd-sysv-generator[199403]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 17:23:38 compute-0 systemd[1]: Starting openstack_network_exporter container...
Nov 28 17:23:38 compute-0 systemd[1]: Started libcrun container.
Nov 28 17:23:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d4e1113bd6f42aa50a81afa968931ad1eda6de7028523a4f773a4e232b6f52ac/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Nov 28 17:23:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d4e1113bd6f42aa50a81afa968931ad1eda6de7028523a4f773a4e232b6f52ac/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 28 17:23:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d4e1113bd6f42aa50a81afa968931ad1eda6de7028523a4f773a4e232b6f52ac/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 28 17:23:38 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882.
Nov 28 17:23:38 compute-0 podman[199411]: 2025-11-28 17:23:38.532740989 +0000 UTC m=+0.153715360 container init 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, managed_by=edpm_ansible, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, distribution-scope=public, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, release=1755695350)
Nov 28 17:23:38 compute-0 openstack_network_exporter[199427]: INFO    17:23:38 main.go:48: registering *bridge.Collector
Nov 28 17:23:38 compute-0 openstack_network_exporter[199427]: INFO    17:23:38 main.go:48: registering *coverage.Collector
Nov 28 17:23:38 compute-0 openstack_network_exporter[199427]: INFO    17:23:38 main.go:48: registering *datapath.Collector
Nov 28 17:23:38 compute-0 openstack_network_exporter[199427]: INFO    17:23:38 main.go:48: registering *iface.Collector
Nov 28 17:23:38 compute-0 openstack_network_exporter[199427]: INFO    17:23:38 main.go:48: registering *memory.Collector
Nov 28 17:23:38 compute-0 openstack_network_exporter[199427]: INFO    17:23:38 main.go:48: registering *ovnnorthd.Collector
Nov 28 17:23:38 compute-0 openstack_network_exporter[199427]: INFO    17:23:38 main.go:48: registering *ovn.Collector
Nov 28 17:23:38 compute-0 openstack_network_exporter[199427]: INFO    17:23:38 main.go:48: registering *ovsdbserver.Collector
Nov 28 17:23:38 compute-0 openstack_network_exporter[199427]: INFO    17:23:38 main.go:48: registering *pmd_perf.Collector
Nov 28 17:23:38 compute-0 openstack_network_exporter[199427]: INFO    17:23:38 main.go:48: registering *pmd_rxq.Collector
Nov 28 17:23:38 compute-0 openstack_network_exporter[199427]: INFO    17:23:38 main.go:48: registering *vswitch.Collector
Nov 28 17:23:38 compute-0 openstack_network_exporter[199427]: NOTICE  17:23:38 main.go:76: listening on https://:9105/metrics
Nov 28 17:23:38 compute-0 podman[199411]: 2025-11-28 17:23:38.561641951 +0000 UTC m=+0.182616302 container start 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., name=ubi9-minimal, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350)
Nov 28 17:23:38 compute-0 podman[199411]: openstack_network_exporter
Nov 28 17:23:38 compute-0 systemd[1]: Started openstack_network_exporter container.
Nov 28 17:23:38 compute-0 sudo[199370]: pam_unix(sudo:session): session closed for user root
Nov 28 17:23:38 compute-0 podman[199433]: 2025-11-28 17:23:38.661268824 +0000 UTC m=+0.090240920 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, version=9.6, architecture=x86_64, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, name=ubi9-minimal, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9)
Nov 28 17:23:39 compute-0 sudo[199609]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zbnfxuxnxftmzzamyxrngurhktnjgfzy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350618.819593-1272-199051398585481/AnsiballZ_systemd.py'
Nov 28 17:23:39 compute-0 sudo[199609]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:23:39 compute-0 python3.9[199611]: ansible-ansible.builtin.systemd Invoked with name=edpm_openstack_network_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 17:23:39 compute-0 systemd[1]: Stopping openstack_network_exporter container...
Nov 28 17:23:39 compute-0 systemd[1]: libpod-6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882.scope: Deactivated successfully.
Nov 28 17:23:39 compute-0 podman[199621]: 2025-11-28 17:23:39.562061671 +0000 UTC m=+0.057192347 container died 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, config_id=edpm, managed_by=edpm_ansible, distribution-scope=public, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., io.buildah.version=1.33.7, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6)
Nov 28 17:23:39 compute-0 systemd[1]: 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882-3b9c38beaa036cb3.timer: Deactivated successfully.
Nov 28 17:23:39 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882.
Nov 28 17:23:39 compute-0 podman[199613]: 2025-11-28 17:23:39.589058338 +0000 UTC m=+0.105496425 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 28 17:23:39 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882-userdata-shm.mount: Deactivated successfully.
Nov 28 17:23:39 compute-0 systemd[1]: var-lib-containers-storage-overlay-d4e1113bd6f42aa50a81afa968931ad1eda6de7028523a4f773a4e232b6f52ac-merged.mount: Deactivated successfully.
Nov 28 17:23:40 compute-0 podman[199668]: 2025-11-28 17:23:40.211873095 +0000 UTC m=+0.072870215 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 28 17:23:40 compute-0 podman[199621]: 2025-11-28 17:23:40.806696176 +0000 UTC m=+1.301826852 container cleanup 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, architecture=x86_64, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, io.buildah.version=1.33.7, config_id=edpm, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=9.6, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc.)
Nov 28 17:23:40 compute-0 podman[199621]: openstack_network_exporter
Nov 28 17:23:40 compute-0 systemd[1]: edpm_openstack_network_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Nov 28 17:23:40 compute-0 podman[199688]: openstack_network_exporter
Nov 28 17:23:40 compute-0 systemd[1]: edpm_openstack_network_exporter.service: Failed with result 'exit-code'.
Nov 28 17:23:40 compute-0 systemd[1]: Stopped openstack_network_exporter container.
Nov 28 17:23:40 compute-0 systemd[1]: Starting openstack_network_exporter container...
Nov 28 17:23:40 compute-0 systemd[1]: Started libcrun container.
Nov 28 17:23:41 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d4e1113bd6f42aa50a81afa968931ad1eda6de7028523a4f773a4e232b6f52ac/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Nov 28 17:23:41 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d4e1113bd6f42aa50a81afa968931ad1eda6de7028523a4f773a4e232b6f52ac/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 28 17:23:41 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d4e1113bd6f42aa50a81afa968931ad1eda6de7028523a4f773a4e232b6f52ac/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 28 17:23:41 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882.
Nov 28 17:23:41 compute-0 podman[199701]: 2025-11-28 17:23:41.04911226 +0000 UTC m=+0.139974750 container init 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red 
Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, io.openshift.expose-services=, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-type=git, config_id=edpm)
Nov 28 17:23:41 compute-0 openstack_network_exporter[199717]: INFO    17:23:41 main.go:48: registering *bridge.Collector
Nov 28 17:23:41 compute-0 openstack_network_exporter[199717]: INFO    17:23:41 main.go:48: registering *coverage.Collector
Nov 28 17:23:41 compute-0 openstack_network_exporter[199717]: INFO    17:23:41 main.go:48: registering *datapath.Collector
Nov 28 17:23:41 compute-0 openstack_network_exporter[199717]: INFO    17:23:41 main.go:48: registering *iface.Collector
Nov 28 17:23:41 compute-0 openstack_network_exporter[199717]: INFO    17:23:41 main.go:48: registering *memory.Collector
Nov 28 17:23:41 compute-0 openstack_network_exporter[199717]: INFO    17:23:41 main.go:48: registering *ovnnorthd.Collector
Nov 28 17:23:41 compute-0 openstack_network_exporter[199717]: INFO    17:23:41 main.go:48: registering *ovn.Collector
Nov 28 17:23:41 compute-0 openstack_network_exporter[199717]: INFO    17:23:41 main.go:48: registering *ovsdbserver.Collector
Nov 28 17:23:41 compute-0 openstack_network_exporter[199717]: INFO    17:23:41 main.go:48: registering *pmd_perf.Collector
Nov 28 17:23:41 compute-0 openstack_network_exporter[199717]: INFO    17:23:41 main.go:48: registering *pmd_rxq.Collector
Nov 28 17:23:41 compute-0 openstack_network_exporter[199717]: INFO    17:23:41 main.go:48: registering *vswitch.Collector
Nov 28 17:23:41 compute-0 openstack_network_exporter[199717]: NOTICE  17:23:41 main.go:76: listening on https://:9105/metrics
Nov 28 17:23:41 compute-0 podman[199701]: 2025-11-28 17:23:41.0820475 +0000 UTC m=+0.172909950 container start 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, distribution-scope=public, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, vcs-type=git, config_id=edpm, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 28 17:23:41 compute-0 podman[199701]: openstack_network_exporter
Nov 28 17:23:41 compute-0 systemd[1]: Started openstack_network_exporter container.
Nov 28 17:23:41 compute-0 sudo[199609]: pam_unix(sudo:session): session closed for user root
Nov 28 17:23:41 compute-0 podman[199727]: 2025-11-28 17:23:41.161800173 +0000 UTC m=+0.066825638 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.buildah.version=1.33.7, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, config_id=edpm, managed_by=edpm_ansible, release=1755695350, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that 
uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=ubi9-minimal)
Nov 28 17:23:41 compute-0 sudo[199899]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwpcobwhgrkjavextrntwgwixdhgqdcq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350621.3337433-1288-122150710171053/AnsiballZ_find.py'
Nov 28 17:23:41 compute-0 sudo[199899]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:23:42 compute-0 python3.9[199901]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 28 17:23:42 compute-0 sudo[199899]: pam_unix(sudo:session): session closed for user root
Nov 28 17:23:56 compute-0 podman[199926]: 2025-11-28 17:23:56.195437104 +0000 UTC m=+0.053170510 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 17:23:58 compute-0 sudo[200074]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-loluqkpxkxmvycpukgqivxlrmmsdzyif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350638.1457775-1472-237784274008922/AnsiballZ_podman_container_info.py'
Nov 28 17:23:58 compute-0 sudo[200074]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:23:58 compute-0 python3.9[200076]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Nov 28 17:23:58 compute-0 sudo[200074]: pam_unix(sudo:session): session closed for user root
Nov 28 17:23:59 compute-0 sudo[200240]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ouwhdfqiqbhhoioyolfkkirwgfkevnau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350638.9206467-1480-245114496851549/AnsiballZ_podman_container_exec.py'
Nov 28 17:23:59 compute-0 sudo[200240]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:23:59 compute-0 python3.9[200242]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 28 17:23:59 compute-0 systemd[1]: Started libpod-conmon-28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc.scope.
Nov 28 17:23:59 compute-0 podman[200243]: 2025-11-28 17:23:59.654952816 +0000 UTC m=+0.123412257 container exec 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 17:23:59 compute-0 podman[200243]: 2025-11-28 17:23:59.667280485 +0000 UTC m=+0.135739906 container exec_died 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 28 17:23:59 compute-0 sudo[200240]: pam_unix(sudo:session): session closed for user root
Nov 28 17:23:59 compute-0 systemd[1]: libpod-conmon-28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc.scope: Deactivated successfully.
Nov 28 17:24:00 compute-0 sudo[200425]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwhkihhwtlvsdggdrrhwrfmxowcllgnn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350639.8718822-1488-228514303958543/AnsiballZ_podman_container_exec.py'
Nov 28 17:24:00 compute-0 sudo[200425]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:24:00 compute-0 python3.9[200427]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 28 17:24:00 compute-0 systemd[1]: Started libpod-conmon-28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc.scope.
Nov 28 17:24:00 compute-0 podman[200428]: 2025-11-28 17:24:00.553080445 +0000 UTC m=+0.098866372 container exec 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 17:24:00 compute-0 podman[200428]: 2025-11-28 17:24:00.589234169 +0000 UTC m=+0.135020076 container exec_died 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 28 17:24:00 compute-0 systemd[1]: libpod-conmon-28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc.scope: Deactivated successfully.
Nov 28 17:24:00 compute-0 sudo[200425]: pam_unix(sudo:session): session closed for user root
Nov 28 17:24:01 compute-0 sudo[200608]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zppljwdskxsngnchxzmhewoylzcpwiav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350640.8323092-1496-256314696392845/AnsiballZ_file.py'
Nov 28 17:24:01 compute-0 sudo[200608]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:24:01 compute-0 python3.9[200610]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:24:01 compute-0 sudo[200608]: pam_unix(sudo:session): session closed for user root
Nov 28 17:24:02 compute-0 sudo[200760]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivyfjmbgltktanpdajccgntzlatjasfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350641.6959689-1505-75856350000385/AnsiballZ_podman_container_info.py'
Nov 28 17:24:02 compute-0 sudo[200760]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:24:02 compute-0 python3.9[200762]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Nov 28 17:24:02 compute-0 sudo[200760]: pam_unix(sudo:session): session closed for user root
Nov 28 17:24:02 compute-0 sudo[200925]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bpusrvzgejzbszsdunwtyeruqncvmbfu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350642.5340307-1513-245170430060305/AnsiballZ_podman_container_exec.py'
Nov 28 17:24:02 compute-0 sudo[200925]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:24:03 compute-0 python3.9[200927]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 28 17:24:03 compute-0 systemd[1]: Started libpod-conmon-d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1.scope.
Nov 28 17:24:03 compute-0 podman[200928]: 2025-11-28 17:24:03.18592682 +0000 UTC m=+0.103877158 container exec d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 28 17:24:03 compute-0 podman[200928]: 2025-11-28 17:24:03.222362921 +0000 UTC m=+0.140313199 container exec_died d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 
Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 28 17:24:03 compute-0 systemd[1]: libpod-conmon-d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1.scope: Deactivated successfully.
Nov 28 17:24:03 compute-0 sudo[200925]: pam_unix(sudo:session): session closed for user root
Nov 28 17:24:03 compute-0 sudo[201106]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lyengawiglbwmgizwuvzjunfkugifvzz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350643.459269-1521-150296545795052/AnsiballZ_podman_container_exec.py'
Nov 28 17:24:03 compute-0 sudo[201106]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:24:04 compute-0 python3.9[201108]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 28 17:24:04 compute-0 systemd[1]: Started libpod-conmon-d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1.scope.
Nov 28 17:24:04 compute-0 podman[201109]: 2025-11-28 17:24:04.148161577 +0000 UTC m=+0.116399073 container exec d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 28 17:24:04 compute-0 podman[201109]: 2025-11-28 17:24:04.182367454 +0000 UTC m=+0.150604950 container exec_died d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 28 17:24:04 compute-0 systemd[1]: libpod-conmon-d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1.scope: Deactivated successfully.
Nov 28 17:24:04 compute-0 sudo[201106]: pam_unix(sudo:session): session closed for user root
Nov 28 17:24:04 compute-0 sudo[201287]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rsfrsyfjezzkuckglinioiqxkbqxnjrc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350644.4301-1529-172598064248194/AnsiballZ_file.py'
Nov 28 17:24:04 compute-0 sudo[201287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:24:04 compute-0 python3.9[201289]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:24:04 compute-0 sudo[201287]: pam_unix(sudo:session): session closed for user root
Nov 28 17:24:05 compute-0 podman[201314]: 2025-11-28 17:24:05.236716175 +0000 UTC m=+0.097675497 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 28 17:24:05 compute-0 sudo[201459]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ihzezmlcgbgxqkbnsaubihoyxlgqcsdi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350645.2346098-1538-92367602644164/AnsiballZ_podman_container_info.py'
Nov 28 17:24:05 compute-0 sudo[201459]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:24:05 compute-0 python3.9[201461]: ansible-containers.podman.podman_container_info Invoked with name=['multipathd'] executable=podman
Nov 28 17:24:05 compute-0 sudo[201459]: pam_unix(sudo:session): session closed for user root
Nov 28 17:24:06 compute-0 sudo[201624]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcpixwozgzboorqyyqgoybstcozeljvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350646.006728-1546-58287800734776/AnsiballZ_podman_container_exec.py'
Nov 28 17:24:06 compute-0 sudo[201624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:24:06 compute-0 python3.9[201626]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 28 17:24:06 compute-0 systemd[1]: Started libpod-conmon-077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176.scope.
Nov 28 17:24:06 compute-0 podman[201627]: 2025-11-28 17:24:06.664895788 +0000 UTC m=+0.089268831 container exec 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 28 17:24:06 compute-0 podman[201627]: 2025-11-28 17:24:06.700239758 +0000 UTC m=+0.124612771 container exec_died 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Nov 28 17:24:06 compute-0 systemd[1]: libpod-conmon-077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176.scope: Deactivated successfully.
Nov 28 17:24:06 compute-0 sudo[201624]: pam_unix(sudo:session): session closed for user root
Nov 28 17:24:07 compute-0 sudo[201808]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dpelquftbtxpibbepaqmmtnceqlkvxda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350646.9507153-1554-14664455133570/AnsiballZ_podman_container_exec.py'
Nov 28 17:24:07 compute-0 sudo[201808]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:24:07 compute-0 python3.9[201810]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 28 17:24:07 compute-0 systemd[1]: Started libpod-conmon-077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176.scope.
Nov 28 17:24:07 compute-0 podman[201811]: 2025-11-28 17:24:07.571182355 +0000 UTC m=+0.075271674 container exec 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 28 17:24:07 compute-0 podman[201811]: 2025-11-28 17:24:07.606368 +0000 UTC m=+0.110457319 container exec_died 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Nov 28 17:24:07 compute-0 systemd[1]: libpod-conmon-077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176.scope: Deactivated successfully.
Nov 28 17:24:07 compute-0 sudo[201808]: pam_unix(sudo:session): session closed for user root
Nov 28 17:24:08 compute-0 sudo[201992]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pamahrdkiumfarwokbusgnwyxmghgvly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350647.834554-1562-178001599090758/AnsiballZ_file.py'
Nov 28 17:24:08 compute-0 sudo[201992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:24:08 compute-0 python3.9[201994]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/multipathd recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:24:08 compute-0 sudo[201992]: pam_unix(sudo:session): session closed for user root
Nov 28 17:24:08 compute-0 sudo[202144]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hlvorqirghxyvzqlwcmbwtjtezdwgirs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350648.5828083-1571-125428809017380/AnsiballZ_podman_container_info.py'
Nov 28 17:24:08 compute-0 sudo[202144]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:24:09 compute-0 python3.9[202146]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Nov 28 17:24:09 compute-0 sudo[202144]: pam_unix(sudo:session): session closed for user root
Nov 28 17:24:09 compute-0 sudo[202318]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vyecupfirxlwzgsmrikrnycwoxhsyczg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350649.406068-1579-103815449230773/AnsiballZ_podman_container_exec.py'
Nov 28 17:24:09 compute-0 sudo[202318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:24:09 compute-0 podman[202282]: 2025-11-28 17:24:09.899632581 +0000 UTC m=+0.162965150 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 28 17:24:10 compute-0 python3.9[202323]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 28 17:24:10 compute-0 systemd[1]: Started libpod-conmon-d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf.scope.
Nov 28 17:24:10 compute-0 podman[202337]: 2025-11-28 17:24:10.148471871 +0000 UTC m=+0.094506634 container exec d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 17:24:10 compute-0 podman[202337]: 2025-11-28 17:24:10.179159005 +0000 UTC m=+0.125193758 container exec_died d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 17:24:10 compute-0 systemd[1]: libpod-conmon-d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf.scope: Deactivated successfully.
Nov 28 17:24:10 compute-0 sudo[202318]: pam_unix(sudo:session): session closed for user root
Nov 28 17:24:10 compute-0 sudo[202515]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ysvivurfzsylacabljnwlcukqooinkab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350650.3837729-1587-196906933913023/AnsiballZ_podman_container_exec.py'
Nov 28 17:24:10 compute-0 sudo[202515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:24:10 compute-0 python3.9[202517]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 28 17:24:10 compute-0 systemd[1]: Started libpod-conmon-d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf.scope.
Nov 28 17:24:11 compute-0 podman[202518]: 2025-11-28 17:24:10.999304632 +0000 UTC m=+0.113144778 container exec d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 17:24:11 compute-0 podman[202518]: 2025-11-28 17:24:11.033836988 +0000 UTC m=+0.147677134 container exec_died d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 17:24:11 compute-0 systemd[1]: libpod-conmon-d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf.scope: Deactivated successfully.
Nov 28 17:24:11 compute-0 sudo[202515]: pam_unix(sudo:session): session closed for user root
Nov 28 17:24:11 compute-0 podman[202534]: 2025-11-28 17:24:11.090593972 +0000 UTC m=+0.090929161 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 28 17:24:11 compute-0 sudo[202732]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwapzmlvpsfupbknetxjxplnoktwtddw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350651.682564-1595-70871553674050/AnsiballZ_file.py'
Nov 28 17:24:11 compute-0 sudo[202732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:24:11 compute-0 podman[202694]: 2025-11-28 17:24:11.969071129 +0000 UTC m=+0.064671586 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_id=edpm, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6)
Nov 28 17:24:12 compute-0 python3.9[202740]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:24:12 compute-0 sudo[202732]: pam_unix(sudo:session): session closed for user root
Nov 28 17:24:12 compute-0 sudo[202890]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pkxznxhortsuugvvwwhpjskqbzsajpgt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350652.4583998-1604-71642930047759/AnsiballZ_podman_container_info.py'
Nov 28 17:24:12 compute-0 sudo[202890]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:24:12 compute-0 python3.9[202892]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Nov 28 17:24:13 compute-0 sudo[202890]: pam_unix(sudo:session): session closed for user root
Nov 28 17:24:14 compute-0 sudo[203055]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lytzgwvdmkgoymluigihcquyqyylwsrp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350653.6825829-1612-123898087831774/AnsiballZ_podman_container_exec.py'
Nov 28 17:24:14 compute-0 sudo[203055]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:24:14 compute-0 python3.9[203057]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 28 17:24:14 compute-0 systemd[1]: Started libpod-conmon-6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882.scope.
Nov 28 17:24:14 compute-0 podman[203058]: 2025-11-28 17:24:14.446131484 +0000 UTC m=+0.107770071 container exec 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, config_id=edpm, name=ubi9-minimal, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, maintainer=Red Hat, Inc., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, distribution-scope=public, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc.)
Nov 28 17:24:14 compute-0 podman[203058]: 2025-11-28 17:24:14.48409586 +0000 UTC m=+0.145734347 container exec_died 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vcs-type=git, vendor=Red Hat, Inc., version=9.6, maintainer=Red Hat, Inc., name=ubi9-minimal, io.buildah.version=1.33.7, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, distribution-scope=public, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 28 17:24:14 compute-0 systemd[1]: libpod-conmon-6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882.scope: Deactivated successfully.
Nov 28 17:24:14 compute-0 sudo[203055]: pam_unix(sudo:session): session closed for user root
Nov 28 17:24:15 compute-0 sudo[203239]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iftmpcoynbhftebhqixksrcfelbhfnam ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350654.7063835-1620-181198219345595/AnsiballZ_podman_container_exec.py'
Nov 28 17:24:15 compute-0 sudo[203239]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:24:15 compute-0 python3.9[203241]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 28 17:24:15 compute-0 systemd[1]: Started libpod-conmon-6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882.scope.
Nov 28 17:24:15 compute-0 podman[203242]: 2025-11-28 17:24:15.324408825 +0000 UTC m=+0.078643033 container exec 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.buildah.version=1.33.7, release=1755695350, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, managed_by=edpm_ansible, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vcs-type=git, version=9.6, distribution-scope=public)
Nov 28 17:24:15 compute-0 podman[203242]: 2025-11-28 17:24:15.357216821 +0000 UTC m=+0.111451029 container exec_died 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.buildah.version=1.33.7, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, architecture=x86_64)
Nov 28 17:24:15 compute-0 systemd[1]: libpod-conmon-6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882.scope: Deactivated successfully.
Nov 28 17:24:15 compute-0 sudo[203239]: pam_unix(sudo:session): session closed for user root
Nov 28 17:24:15 compute-0 sudo[203421]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fadbrndmrqzygzbgnxigzvnzezaegcrz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350655.6776218-1628-80400280493616/AnsiballZ_file.py'
Nov 28 17:24:15 compute-0 sudo[203421]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:24:16 compute-0 python3.9[203423]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:24:16 compute-0 sudo[203421]: pam_unix(sudo:session): session closed for user root
Nov 28 17:24:16 compute-0 sudo[203573]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvjyzuibehtgvjcmdzmmarcetyfftvxq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350656.4156713-1638-186204834715976/AnsiballZ_file.py'
Nov 28 17:24:16 compute-0 sudo[203573]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:24:16 compute-0 python3.9[203575]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:24:16 compute-0 sudo[203573]: pam_unix(sudo:session): session closed for user root
Nov 28 17:24:17 compute-0 sudo[203725]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmvnznlaozvmlymmukytgiaiaofjjcgj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350657.155264-1654-135143842931037/AnsiballZ_stat.py'
Nov 28 17:24:17 compute-0 sudo[203725]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:24:17 compute-0 python3.9[203727]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:24:17 compute-0 sudo[203725]: pam_unix(sudo:session): session closed for user root
Nov 28 17:24:18 compute-0 sudo[203848]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hyvfallcdafjikvgfuxuddytkrjnmlwf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350657.155264-1654-135143842931037/AnsiballZ_copy.py'
Nov 28 17:24:18 compute-0 sudo[203848]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:24:18 compute-0 nova_compute[187223]: 2025-11-28 17:24:18.030 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:24:18 compute-0 nova_compute[187223]: 2025-11-28 17:24:18.172 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:24:18 compute-0 nova_compute[187223]: 2025-11-28 17:24:18.173 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 17:24:18 compute-0 nova_compute[187223]: 2025-11-28 17:24:18.173 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 17:24:18 compute-0 nova_compute[187223]: 2025-11-28 17:24:18.243 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 17:24:18 compute-0 nova_compute[187223]: 2025-11-28 17:24:18.244 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:24:18 compute-0 nova_compute[187223]: 2025-11-28 17:24:18.245 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:24:18 compute-0 nova_compute[187223]: 2025-11-28 17:24:18.245 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:24:18 compute-0 nova_compute[187223]: 2025-11-28 17:24:18.245 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 17:24:18 compute-0 python3.9[203850]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764350657.155264-1654-135143842931037/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:24:18 compute-0 sudo[203848]: pam_unix(sudo:session): session closed for user root
Nov 28 17:24:18 compute-0 nova_compute[187223]: 2025-11-28 17:24:18.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:24:18 compute-0 nova_compute[187223]: 2025-11-28 17:24:18.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:24:18 compute-0 nova_compute[187223]: 2025-11-28 17:24:18.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:24:18 compute-0 sudo[204000]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tcmsdglezsaqnzrviolijyqxvwpbqemw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350658.6194103-1686-10695241625179/AnsiballZ_file.py'
Nov 28 17:24:18 compute-0 sudo[204000]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:24:19 compute-0 python3.9[204002]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:24:19 compute-0 sudo[204000]: pam_unix(sudo:session): session closed for user root
Nov 28 17:24:19 compute-0 sudo[204152]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fwbwhczsmjfsewnaxwyhclkgjwunbchh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350659.337681-1702-254754747411189/AnsiballZ_stat.py'
Nov 28 17:24:19 compute-0 sudo[204152]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:24:19 compute-0 nova_compute[187223]: 2025-11-28 17:24:19.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:24:19 compute-0 nova_compute[187223]: 2025-11-28 17:24:19.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:24:19 compute-0 nova_compute[187223]: 2025-11-28 17:24:19.711 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:24:19 compute-0 nova_compute[187223]: 2025-11-28 17:24:19.712 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:24:19 compute-0 nova_compute[187223]: 2025-11-28 17:24:19.712 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:24:19 compute-0 nova_compute[187223]: 2025-11-28 17:24:19.712 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 17:24:19 compute-0 python3.9[204154]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:24:19 compute-0 nova_compute[187223]: 2025-11-28 17:24:19.904 187227 WARNING nova.virt.libvirt.driver [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 17:24:19 compute-0 nova_compute[187223]: 2025-11-28 17:24:19.905 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6048MB free_disk=73.37567520141602GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 17:24:19 compute-0 nova_compute[187223]: 2025-11-28 17:24:19.905 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:24:19 compute-0 nova_compute[187223]: 2025-11-28 17:24:19.906 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:24:19 compute-0 sudo[204152]: pam_unix(sudo:session): session closed for user root
Nov 28 17:24:19 compute-0 nova_compute[187223]: 2025-11-28 17:24:19.971 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 17:24:19 compute-0 nova_compute[187223]: 2025-11-28 17:24:19.971 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 17:24:19 compute-0 nova_compute[187223]: 2025-11-28 17:24:19.996 187227 DEBUG nova.compute.provider_tree [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 17:24:20 compute-0 nova_compute[187223]: 2025-11-28 17:24:20.011 187227 DEBUG nova.scheduler.client.report [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 17:24:20 compute-0 nova_compute[187223]: 2025-11-28 17:24:20.013 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 17:24:20 compute-0 nova_compute[187223]: 2025-11-28 17:24:20.013 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:24:20 compute-0 sudo[204230]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cmotcsdqwkrausmlkgavmjawblhhgrve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350659.337681-1702-254754747411189/AnsiballZ_file.py'
Nov 28 17:24:20 compute-0 sudo[204230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:24:20 compute-0 python3.9[204232]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:24:20 compute-0 sudo[204230]: pam_unix(sudo:session): session closed for user root
Nov 28 17:24:20 compute-0 sudo[204382]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xaotlqzxcrsjlrpxqoolzssdyzijmcfa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350660.5813496-1726-148280465535507/AnsiballZ_stat.py'
Nov 28 17:24:20 compute-0 sudo[204382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:24:21 compute-0 python3.9[204384]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:24:21 compute-0 sudo[204382]: pam_unix(sudo:session): session closed for user root
Nov 28 17:24:21 compute-0 sudo[204460]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rearbudzqxlcnylcpebfvjnsaouojmvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350660.5813496-1726-148280465535507/AnsiballZ_file.py'
Nov 28 17:24:21 compute-0 sudo[204460]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:24:21 compute-0 python3.9[204462]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.tdjop8y8 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:24:21 compute-0 sudo[204460]: pam_unix(sudo:session): session closed for user root
Nov 28 17:24:22 compute-0 sudo[204612]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nljvwjgatstdtsbabmnfqizikxvnklxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350661.8974802-1750-253543576372492/AnsiballZ_stat.py'
Nov 28 17:24:22 compute-0 sudo[204612]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:24:22 compute-0 python3.9[204614]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:24:22 compute-0 sudo[204612]: pam_unix(sudo:session): session closed for user root
Nov 28 17:24:22 compute-0 sudo[204690]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vbcnawjpivuikeldhaoqkmraezicmgkm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350661.8974802-1750-253543576372492/AnsiballZ_file.py'
Nov 28 17:24:22 compute-0 sudo[204690]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:24:22 compute-0 python3.9[204692]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:24:22 compute-0 sudo[204690]: pam_unix(sudo:session): session closed for user root
Nov 28 17:24:23 compute-0 sudo[204842]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntfrtprowojzfuwdloituyzoymiaqkct ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350663.1168709-1776-232678291966024/AnsiballZ_command.py'
Nov 28 17:24:23 compute-0 sudo[204842]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:24:23 compute-0 python3.9[204844]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 17:24:23 compute-0 sudo[204842]: pam_unix(sudo:session): session closed for user root
Nov 28 17:24:24 compute-0 sudo[204995]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-whdygpbakygjzgviziehxlyxgwdbpigq ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764350663.8860643-1792-277081405891609/AnsiballZ_edpm_nftables_from_files.py'
Nov 28 17:24:24 compute-0 sudo[204995]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:24:24 compute-0 python3[204997]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 28 17:24:24 compute-0 sudo[204995]: pam_unix(sudo:session): session closed for user root
Nov 28 17:24:25 compute-0 sudo[205147]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsydsblwppoikdjuhvyidwfptpjttnsz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350664.87023-1808-209118853270296/AnsiballZ_stat.py'
Nov 28 17:24:25 compute-0 sudo[205147]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:24:25 compute-0 python3.9[205149]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:24:25 compute-0 sudo[205147]: pam_unix(sudo:session): session closed for user root
Nov 28 17:24:25 compute-0 sudo[205225]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmydltqjhtrghthfassgivvfmznoynoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350664.87023-1808-209118853270296/AnsiballZ_file.py'
Nov 28 17:24:25 compute-0 sudo[205225]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:24:25 compute-0 python3.9[205227]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:24:25 compute-0 sudo[205225]: pam_unix(sudo:session): session closed for user root
Nov 28 17:24:26 compute-0 sudo[205388]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-byocfqndcqvnolbfhwrycpsasisemcdy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350666.147555-1832-26125204177937/AnsiballZ_stat.py'
Nov 28 17:24:26 compute-0 sudo[205388]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:24:26 compute-0 podman[205351]: 2025-11-28 17:24:26.542828262 +0000 UTC m=+0.089550921 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 17:24:26 compute-0 python3.9[205393]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:24:26 compute-0 sudo[205388]: pam_unix(sudo:session): session closed for user root
Nov 28 17:24:26 compute-0 sudo[205479]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wefrhdviqrplrfgxepnufaccguroesav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350666.147555-1832-26125204177937/AnsiballZ_file.py'
Nov 28 17:24:26 compute-0 sudo[205479]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:24:27 compute-0 python3.9[205481]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:24:27 compute-0 sudo[205479]: pam_unix(sudo:session): session closed for user root
Nov 28 17:24:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:24:27.667 104433 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:24:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:24:27.668 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:24:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:24:27.668 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:24:27 compute-0 sudo[205631]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umswtotbzxsrkhnaemrsxypbdtoocpbj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350667.3945923-1856-251249956305571/AnsiballZ_stat.py'
Nov 28 17:24:27 compute-0 sudo[205631]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:24:27 compute-0 python3.9[205633]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:24:27 compute-0 sudo[205631]: pam_unix(sudo:session): session closed for user root
Nov 28 17:24:28 compute-0 sudo[205709]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qlizslyolrcjdmdkrjfuwvmwallksnug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350667.3945923-1856-251249956305571/AnsiballZ_file.py'
Nov 28 17:24:28 compute-0 sudo[205709]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:24:28 compute-0 python3.9[205711]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:24:28 compute-0 sudo[205709]: pam_unix(sudo:session): session closed for user root
Nov 28 17:24:28 compute-0 sudo[205861]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vlymlzbdjvalnieshohkctxttixhhxqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350668.6101267-1880-260488461176202/AnsiballZ_stat.py'
Nov 28 17:24:28 compute-0 sudo[205861]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:24:29 compute-0 python3.9[205863]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:24:29 compute-0 sudo[205861]: pam_unix(sudo:session): session closed for user root
Nov 28 17:24:29 compute-0 sudo[205939]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrsewobwfvisamdchmcnbxyqkjeqxciy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350668.6101267-1880-260488461176202/AnsiballZ_file.py'
Nov 28 17:24:29 compute-0 sudo[205939]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:24:29 compute-0 python3.9[205941]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:24:29 compute-0 sudo[205939]: pam_unix(sudo:session): session closed for user root
Nov 28 17:24:30 compute-0 sudo[206091]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrahpjqjrfsegbwdkggiqhtnxkadliud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350669.829297-1904-427047063065/AnsiballZ_stat.py'
Nov 28 17:24:30 compute-0 sudo[206091]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:24:30 compute-0 python3.9[206093]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 17:24:30 compute-0 sudo[206091]: pam_unix(sudo:session): session closed for user root
Nov 28 17:24:30 compute-0 sudo[206216]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwiyhkrwirtfupwazzdwjydwwfmuxqvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350669.829297-1904-427047063065/AnsiballZ_copy.py'
Nov 28 17:24:30 compute-0 sudo[206216]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:24:31 compute-0 python3.9[206218]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764350669.829297-1904-427047063065/.source.nft follow=False _original_basename=ruleset.j2 checksum=fb3275eced3a2e06312143189928124e1b2df34a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:24:31 compute-0 sudo[206216]: pam_unix(sudo:session): session closed for user root
Nov 28 17:24:31 compute-0 sudo[206368]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwayufkfvehvsisquepwoqpkpkkbtcsw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350671.2841907-1934-254828767641850/AnsiballZ_file.py'
Nov 28 17:24:31 compute-0 sudo[206368]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:24:31 compute-0 python3.9[206370]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:24:31 compute-0 sudo[206368]: pam_unix(sudo:session): session closed for user root
Nov 28 17:24:32 compute-0 sudo[206520]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yypsbfghivtglcieouwdesmptuaurpjg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350672.0759182-1950-68554936927307/AnsiballZ_command.py'
Nov 28 17:24:32 compute-0 sudo[206520]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:24:32 compute-0 python3.9[206522]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 17:24:32 compute-0 sudo[206520]: pam_unix(sudo:session): session closed for user root
Nov 28 17:24:33 compute-0 sudo[206675]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymzywdybssrpzdfzdlnfczuvjieboqee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350672.7963288-1966-188510930283797/AnsiballZ_blockinfile.py'
Nov 28 17:24:33 compute-0 sudo[206675]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:24:33 compute-0 python3.9[206677]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:24:33 compute-0 sudo[206675]: pam_unix(sudo:session): session closed for user root
Nov 28 17:24:34 compute-0 sudo[206827]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dscmwxahlvkoghgbcnhasqdxmzmdwxyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350673.9091175-1984-217392800190255/AnsiballZ_command.py'
Nov 28 17:24:34 compute-0 sudo[206827]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:24:34 compute-0 python3.9[206829]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 17:24:34 compute-0 sudo[206827]: pam_unix(sudo:session): session closed for user root
Nov 28 17:24:35 compute-0 sudo[206980]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejsdzwirciwlicdmieytxyhphiszefgo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350674.8877084-2000-356889620909/AnsiballZ_stat.py'
Nov 28 17:24:35 compute-0 sudo[206980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:24:35 compute-0 python3.9[206982]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 17:24:35 compute-0 sudo[206980]: pam_unix(sudo:session): session closed for user root
Nov 28 17:24:36 compute-0 sudo[207145]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qzejdgolbsrkgjyipaylxorqlpbxwsmh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350675.7122245-2016-187384325898271/AnsiballZ_command.py'
Nov 28 17:24:36 compute-0 sudo[207145]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:24:36 compute-0 podman[207108]: 2025-11-28 17:24:36.101849867 +0000 UTC m=+0.081906928 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 28 17:24:36 compute-0 python3.9[207151]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 17:24:36 compute-0 sudo[207145]: pam_unix(sudo:session): session closed for user root
Nov 28 17:24:36 compute-0 sudo[207308]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ccoqfzxpqennekmpcdxppiflqfjftomd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764350676.5189648-2032-145329245042179/AnsiballZ_file.py'
Nov 28 17:24:36 compute-0 sudo[207308]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 17:24:36 compute-0 podman[197556]: time="2025-11-28T17:24:36Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:24:36 compute-0 podman[197556]: @ - - [28/Nov/2025:17:24:36 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 28 17:24:36 compute-0 podman[197556]: @ - - [28/Nov/2025:17:24:36 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2563 "" "Go-http-client/1.1"
Nov 28 17:24:37 compute-0 python3.9[207310]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 17:24:37 compute-0 sudo[207308]: pam_unix(sudo:session): session closed for user root
Nov 28 17:24:37 compute-0 sshd-session[187522]: Connection closed by 192.168.122.30 port 35150
Nov 28 17:24:37 compute-0 sshd-session[187519]: pam_unix(sshd:session): session closed for user zuul
Nov 28 17:24:37 compute-0 systemd[1]: session-26.scope: Deactivated successfully.
Nov 28 17:24:37 compute-0 systemd[1]: session-26.scope: Consumed 1min 29.420s CPU time.
Nov 28 17:24:37 compute-0 systemd-logind[788]: Session 26 logged out. Waiting for processes to exit.
Nov 28 17:24:37 compute-0 systemd-logind[788]: Removed session 26.
Nov 28 17:24:38 compute-0 openstack_network_exporter[199717]: ERROR   17:24:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:24:38 compute-0 openstack_network_exporter[199717]: ERROR   17:24:38 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:24:38 compute-0 openstack_network_exporter[199717]: ERROR   17:24:38 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:24:38 compute-0 openstack_network_exporter[199717]: ERROR   17:24:38 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:24:38 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:24:38 compute-0 openstack_network_exporter[199717]: ERROR   17:24:38 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:24:38 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:24:40 compute-0 podman[207339]: 2025-11-28 17:24:40.269579874 +0000 UTC m=+0.119044810 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 17:24:41 compute-0 podman[207367]: 2025-11-28 17:24:41.198648475 +0000 UTC m=+0.060584467 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 28 17:24:42 compute-0 podman[207387]: 2025-11-28 17:24:42.20750473 +0000 UTC m=+0.067383174 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, name=ubi9-minimal, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, architecture=x86_64, io.buildah.version=1.33.7)
Nov 28 17:24:57 compute-0 podman[207409]: 2025-11-28 17:24:57.215209744 +0000 UTC m=+0.070114104 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 17:24:59 compute-0 podman[197556]: time="2025-11-28T17:24:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:24:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:24:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 28 17:24:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:24:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2563 "" "Go-http-client/1.1"
Nov 28 17:25:01 compute-0 openstack_network_exporter[199717]: ERROR   17:25:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:25:01 compute-0 openstack_network_exporter[199717]: ERROR   17:25:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:25:01 compute-0 openstack_network_exporter[199717]: ERROR   17:25:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:25:01 compute-0 openstack_network_exporter[199717]: ERROR   17:25:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:25:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:25:01 compute-0 openstack_network_exporter[199717]: ERROR   17:25:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:25:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:25:06 compute-0 podman[207438]: 2025-11-28 17:25:06.194100115 +0000 UTC m=+0.055538651 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 17:25:11 compute-0 podman[207457]: 2025-11-28 17:25:11.280251945 +0000 UTC m=+0.129165808 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, io.buildah.version=1.41.3)
Nov 28 17:25:11 compute-0 podman[207483]: 2025-11-28 17:25:11.408232818 +0000 UTC m=+0.096672621 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 28 17:25:13 compute-0 podman[207503]: 2025-11-28 17:25:13.196738945 +0000 UTC m=+0.060169686 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., architecture=x86_64, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, version=9.6, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 17:25:18 compute-0 nova_compute[187223]: 2025-11-28 17:25:18.013 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:25:18 compute-0 nova_compute[187223]: 2025-11-28 17:25:18.014 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 17:25:18 compute-0 nova_compute[187223]: 2025-11-28 17:25:18.014 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 17:25:18 compute-0 nova_compute[187223]: 2025-11-28 17:25:18.041 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 17:25:18 compute-0 nova_compute[187223]: 2025-11-28 17:25:18.041 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:25:18 compute-0 nova_compute[187223]: 2025-11-28 17:25:18.041 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:25:18 compute-0 nova_compute[187223]: 2025-11-28 17:25:18.042 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 17:25:18 compute-0 nova_compute[187223]: 2025-11-28 17:25:18.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:25:18 compute-0 nova_compute[187223]: 2025-11-28 17:25:18.685 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:25:20 compute-0 nova_compute[187223]: 2025-11-28 17:25:20.680 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:25:20 compute-0 nova_compute[187223]: 2025-11-28 17:25:20.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:25:20 compute-0 nova_compute[187223]: 2025-11-28 17:25:20.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:25:20 compute-0 nova_compute[187223]: 2025-11-28 17:25:20.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:25:20 compute-0 nova_compute[187223]: 2025-11-28 17:25:20.886 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:25:20 compute-0 nova_compute[187223]: 2025-11-28 17:25:20.886 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:25:20 compute-0 nova_compute[187223]: 2025-11-28 17:25:20.886 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:25:20 compute-0 nova_compute[187223]: 2025-11-28 17:25:20.887 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 17:25:21 compute-0 nova_compute[187223]: 2025-11-28 17:25:21.092 187227 WARNING nova.virt.libvirt.driver [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 17:25:21 compute-0 nova_compute[187223]: 2025-11-28 17:25:21.093 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6158MB free_disk=73.37586975097656GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 17:25:21 compute-0 nova_compute[187223]: 2025-11-28 17:25:21.094 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:25:21 compute-0 nova_compute[187223]: 2025-11-28 17:25:21.094 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:25:21 compute-0 nova_compute[187223]: 2025-11-28 17:25:21.259 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 17:25:21 compute-0 nova_compute[187223]: 2025-11-28 17:25:21.259 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 17:25:21 compute-0 nova_compute[187223]: 2025-11-28 17:25:21.286 187227 DEBUG nova.compute.provider_tree [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 17:25:21 compute-0 nova_compute[187223]: 2025-11-28 17:25:21.314 187227 DEBUG nova.scheduler.client.report [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 17:25:21 compute-0 nova_compute[187223]: 2025-11-28 17:25:21.315 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 17:25:21 compute-0 nova_compute[187223]: 2025-11-28 17:25:21.315 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.221s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:25:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:25:27.667 104433 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:25:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:25:27.668 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:25:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:25:27.668 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:25:28 compute-0 podman[207525]: 2025-11-28 17:25:28.238167643 +0000 UTC m=+0.097268558 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 17:25:37 compute-0 podman[207550]: 2025-11-28 17:25:37.227137257 +0000 UTC m=+0.076839773 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Nov 28 17:25:42 compute-0 podman[207569]: 2025-11-28 17:25:42.20158083 +0000 UTC m=+0.064066171 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 28 17:25:42 compute-0 podman[207570]: 2025-11-28 17:25:42.244907553 +0000 UTC m=+0.094312962 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Nov 28 17:25:44 compute-0 podman[207614]: 2025-11-28 17:25:44.202273614 +0000 UTC m=+0.061395242 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=edpm, name=ubi9-minimal, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.openshift.expose-services=)
Nov 28 17:25:58 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:25:58.764 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3e:3c:af', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '0e:e0:60:f9:5f:25'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 17:25:58 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:25:58.765 104433 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 17:25:58 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:25:58.767 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2ad2dbac-a967-40fb-b69b-7c374c5f8e9d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:25:59 compute-0 podman[207636]: 2025-11-28 17:25:59.206910931 +0000 UTC m=+0.067933593 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 17:25:59 compute-0 podman[197556]: time="2025-11-28T17:25:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:25:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:25:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 28 17:25:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:25:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2572 "" "Go-http-client/1.1"
Nov 28 17:26:01 compute-0 openstack_network_exporter[199717]: ERROR   17:26:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:26:01 compute-0 openstack_network_exporter[199717]: ERROR   17:26:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:26:01 compute-0 openstack_network_exporter[199717]: ERROR   17:26:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:26:01 compute-0 openstack_network_exporter[199717]: ERROR   17:26:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:26:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:26:01 compute-0 openstack_network_exporter[199717]: ERROR   17:26:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:26:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:26:08 compute-0 podman[207664]: 2025-11-28 17:26:08.199553642 +0000 UTC m=+0.057890476 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 28 17:26:13 compute-0 podman[207683]: 2025-11-28 17:26:13.209990817 +0000 UTC m=+0.065845918 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd)
Nov 28 17:26:13 compute-0 podman[207684]: 2025-11-28 17:26:13.275249228 +0000 UTC m=+0.122688324 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Nov 28 17:26:15 compute-0 podman[207725]: 2025-11-28 17:26:15.236661779 +0000 UTC m=+0.090301071 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, name=ubi9-minimal, release=1755695350, io.buildah.version=1.33.7, vcs-type=git, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 28 17:26:19 compute-0 nova_compute[187223]: 2025-11-28 17:26:19.312 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:26:19 compute-0 nova_compute[187223]: 2025-11-28 17:26:19.332 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:26:19 compute-0 nova_compute[187223]: 2025-11-28 17:26:19.332 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 17:26:19 compute-0 nova_compute[187223]: 2025-11-28 17:26:19.332 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 17:26:19 compute-0 nova_compute[187223]: 2025-11-28 17:26:19.348 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 17:26:19 compute-0 nova_compute[187223]: 2025-11-28 17:26:19.349 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:26:19 compute-0 nova_compute[187223]: 2025-11-28 17:26:19.349 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:26:19 compute-0 nova_compute[187223]: 2025-11-28 17:26:19.349 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:26:19 compute-0 nova_compute[187223]: 2025-11-28 17:26:19.349 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 17:26:19 compute-0 nova_compute[187223]: 2025-11-28 17:26:19.685 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:26:20 compute-0 nova_compute[187223]: 2025-11-28 17:26:20.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:26:20 compute-0 nova_compute[187223]: 2025-11-28 17:26:20.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:26:20 compute-0 nova_compute[187223]: 2025-11-28 17:26:20.685 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:26:20 compute-0 nova_compute[187223]: 2025-11-28 17:26:20.710 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:26:20 compute-0 nova_compute[187223]: 2025-11-28 17:26:20.711 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:26:20 compute-0 nova_compute[187223]: 2025-11-28 17:26:20.711 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:26:20 compute-0 nova_compute[187223]: 2025-11-28 17:26:20.711 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 17:26:20 compute-0 nova_compute[187223]: 2025-11-28 17:26:20.844 187227 WARNING nova.virt.libvirt.driver [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 17:26:20 compute-0 nova_compute[187223]: 2025-11-28 17:26:20.845 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6170MB free_disk=73.3797492980957GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 17:26:20 compute-0 nova_compute[187223]: 2025-11-28 17:26:20.846 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:26:20 compute-0 nova_compute[187223]: 2025-11-28 17:26:20.846 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:26:20 compute-0 nova_compute[187223]: 2025-11-28 17:26:20.943 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 17:26:20 compute-0 nova_compute[187223]: 2025-11-28 17:26:20.943 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 17:26:20 compute-0 nova_compute[187223]: 2025-11-28 17:26:20.968 187227 DEBUG nova.compute.provider_tree [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 17:26:20 compute-0 nova_compute[187223]: 2025-11-28 17:26:20.989 187227 DEBUG nova.scheduler.client.report [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 17:26:20 compute-0 nova_compute[187223]: 2025-11-28 17:26:20.991 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 17:26:20 compute-0 nova_compute[187223]: 2025-11-28 17:26:20.991 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.146s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:26:22 compute-0 nova_compute[187223]: 2025-11-28 17:26:22.987 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:26:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:26:27.669 104433 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:26:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:26:27.669 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:26:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:26:27.670 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:26:29 compute-0 podman[197556]: time="2025-11-28T17:26:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:26:29 compute-0 podman[197556]: @ - - [28/Nov/2025:17:26:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 28 17:26:29 compute-0 podman[197556]: @ - - [28/Nov/2025:17:26:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2571 "" "Go-http-client/1.1"
Nov 28 17:26:30 compute-0 podman[207747]: 2025-11-28 17:26:30.205620186 +0000 UTC m=+0.065946920 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 17:26:31 compute-0 openstack_network_exporter[199717]: ERROR   17:26:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:26:31 compute-0 openstack_network_exporter[199717]: ERROR   17:26:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:26:31 compute-0 openstack_network_exporter[199717]: ERROR   17:26:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:26:31 compute-0 openstack_network_exporter[199717]: ERROR   17:26:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:26:31 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:26:31 compute-0 openstack_network_exporter[199717]: ERROR   17:26:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:26:31 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:26:39 compute-0 podman[207772]: 2025-11-28 17:26:39.242988755 +0000 UTC m=+0.083763871 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 28 17:26:44 compute-0 podman[207791]: 2025-11-28 17:26:44.209591391 +0000 UTC m=+0.065409125 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Nov 28 17:26:44 compute-0 podman[207792]: 2025-11-28 17:26:44.250885064 +0000 UTC m=+0.102561688 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller)
Nov 28 17:26:46 compute-0 podman[207835]: 2025-11-28 17:26:46.200245965 +0000 UTC m=+0.059478113 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, architecture=x86_64, release=1755695350, distribution-scope=public, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, 
container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 28 17:26:59 compute-0 podman[197556]: time="2025-11-28T17:26:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:26:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:26:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 28 17:26:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:26:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2565 "" "Go-http-client/1.1"
Nov 28 17:27:01 compute-0 podman[207856]: 2025-11-28 17:27:01.224861733 +0000 UTC m=+0.070569146 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 17:27:01 compute-0 anacron[31145]: Job `cron.daily' started
Nov 28 17:27:01 compute-0 anacron[31145]: Job `cron.daily' terminated
Nov 28 17:27:01 compute-0 openstack_network_exporter[199717]: ERROR   17:27:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:27:01 compute-0 openstack_network_exporter[199717]: ERROR   17:27:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:27:01 compute-0 openstack_network_exporter[199717]: ERROR   17:27:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:27:01 compute-0 openstack_network_exporter[199717]: ERROR   17:27:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:27:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:27:01 compute-0 openstack_network_exporter[199717]: ERROR   17:27:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:27:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:27:10 compute-0 podman[207882]: 2025-11-28 17:27:10.223824103 +0000 UTC m=+0.086970218 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 28 17:27:15 compute-0 podman[207903]: 2025-11-28 17:27:15.212102308 +0000 UTC m=+0.069888417 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=multipathd)
Nov 28 17:27:15 compute-0 podman[207904]: 2025-11-28 17:27:15.258982401 +0000 UTC m=+0.106845440 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 17:27:17 compute-0 podman[207950]: 2025-11-28 17:27:17.19569229 +0000 UTC m=+0.059922345 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.expose-services=, name=ubi9-minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible)
Nov 28 17:27:17 compute-0 nova_compute[187223]: 2025-11-28 17:27:17.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:27:17 compute-0 nova_compute[187223]: 2025-11-28 17:27:17.685 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 28 17:27:17 compute-0 nova_compute[187223]: 2025-11-28 17:27:17.710 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 28 17:27:17 compute-0 nova_compute[187223]: 2025-11-28 17:27:17.710 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:27:17 compute-0 nova_compute[187223]: 2025-11-28 17:27:17.711 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 28 17:27:17 compute-0 nova_compute[187223]: 2025-11-28 17:27:17.732 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:27:19 compute-0 nova_compute[187223]: 2025-11-28 17:27:19.743 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:27:19 compute-0 nova_compute[187223]: 2025-11-28 17:27:19.744 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:27:19 compute-0 nova_compute[187223]: 2025-11-28 17:27:19.745 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 17:27:20 compute-0 nova_compute[187223]: 2025-11-28 17:27:20.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:27:20 compute-0 nova_compute[187223]: 2025-11-28 17:27:20.685 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 17:27:20 compute-0 nova_compute[187223]: 2025-11-28 17:27:20.685 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 17:27:20 compute-0 nova_compute[187223]: 2025-11-28 17:27:20.699 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 17:27:20 compute-0 nova_compute[187223]: 2025-11-28 17:27:20.700 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:27:20 compute-0 nova_compute[187223]: 2025-11-28 17:27:20.701 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:27:20 compute-0 nova_compute[187223]: 2025-11-28 17:27:20.701 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:27:21 compute-0 nova_compute[187223]: 2025-11-28 17:27:21.685 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:27:22 compute-0 nova_compute[187223]: 2025-11-28 17:27:22.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:27:22 compute-0 nova_compute[187223]: 2025-11-28 17:27:22.710 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:27:22 compute-0 nova_compute[187223]: 2025-11-28 17:27:22.710 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:27:22 compute-0 nova_compute[187223]: 2025-11-28 17:27:22.711 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:27:22 compute-0 nova_compute[187223]: 2025-11-28 17:27:22.711 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 17:27:22 compute-0 nova_compute[187223]: 2025-11-28 17:27:22.888 187227 WARNING nova.virt.libvirt.driver [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 17:27:22 compute-0 nova_compute[187223]: 2025-11-28 17:27:22.889 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6179MB free_disk=73.37974548339844GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 17:27:22 compute-0 nova_compute[187223]: 2025-11-28 17:27:22.889 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:27:22 compute-0 nova_compute[187223]: 2025-11-28 17:27:22.890 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:27:22 compute-0 nova_compute[187223]: 2025-11-28 17:27:22.978 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 17:27:22 compute-0 nova_compute[187223]: 2025-11-28 17:27:22.979 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 17:27:23 compute-0 nova_compute[187223]: 2025-11-28 17:27:23.021 187227 DEBUG nova.scheduler.client.report [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Refreshing inventories for resource provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 28 17:27:23 compute-0 nova_compute[187223]: 2025-11-28 17:27:23.082 187227 DEBUG nova.scheduler.client.report [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Updating ProviderTree inventory for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 28 17:27:23 compute-0 nova_compute[187223]: 2025-11-28 17:27:23.083 187227 DEBUG nova.compute.provider_tree [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Updating inventory in ProviderTree for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 28 17:27:23 compute-0 nova_compute[187223]: 2025-11-28 17:27:23.103 187227 DEBUG nova.scheduler.client.report [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Refreshing aggregate associations for resource provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 28 17:27:23 compute-0 nova_compute[187223]: 2025-11-28 17:27:23.150 187227 DEBUG nova.scheduler.client.report [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Refreshing trait associations for resource provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5, traits: COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_USB,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE41,HW_CPU_X86_SSE42,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_SSE,HW_CPU_X86_SSSE3,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 28 17:27:23 compute-0 nova_compute[187223]: 2025-11-28 17:27:23.183 187227 DEBUG nova.compute.provider_tree [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 17:27:23 compute-0 nova_compute[187223]: 2025-11-28 17:27:23.197 187227 DEBUG nova.scheduler.client.report [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 17:27:23 compute-0 nova_compute[187223]: 2025-11-28 17:27:23.199 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 17:27:23 compute-0 nova_compute[187223]: 2025-11-28 17:27:23.199 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.310s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:27:24 compute-0 nova_compute[187223]: 2025-11-28 17:27:24.194 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:27:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:27:27.669 104433 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:27:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:27:27.670 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:27:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:27:27.670 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:27:29 compute-0 podman[197556]: time="2025-11-28T17:27:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:27:29 compute-0 podman[197556]: @ - - [28/Nov/2025:17:27:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 28 17:27:29 compute-0 podman[197556]: @ - - [28/Nov/2025:17:27:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2572 "" "Go-http-client/1.1"
Nov 28 17:27:31 compute-0 openstack_network_exporter[199717]: ERROR   17:27:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:27:31 compute-0 openstack_network_exporter[199717]: ERROR   17:27:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:27:31 compute-0 openstack_network_exporter[199717]: ERROR   17:27:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:27:31 compute-0 openstack_network_exporter[199717]: ERROR   17:27:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:27:31 compute-0 openstack_network_exporter[199717]: ERROR   17:27:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:27:32 compute-0 podman[207972]: 2025-11-28 17:27:32.192408679 +0000 UTC m=+0.054359843 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 17:27:41 compute-0 podman[207996]: 2025-11-28 17:27:41.200323386 +0000 UTC m=+0.058849415 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Nov 28 17:27:46 compute-0 podman[208015]: 2025-11-28 17:27:46.208217455 +0000 UTC m=+0.058001759 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3)
Nov 28 17:27:46 compute-0 podman[208016]: 2025-11-28 17:27:46.247023362 +0000 UTC m=+0.097254639 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 17:27:48 compute-0 podman[208061]: 2025-11-28 17:27:48.211244928 +0000 UTC m=+0.070648419 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, name=ubi9-minimal, config_id=edpm, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., release=1755695350, io.buildah.version=1.33.7, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6)
Nov 28 17:27:59 compute-0 podman[197556]: time="2025-11-28T17:27:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:27:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:27:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 28 17:27:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:27:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2570 "" "Go-http-client/1.1"
Nov 28 17:28:01 compute-0 openstack_network_exporter[199717]: ERROR   17:28:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:28:01 compute-0 openstack_network_exporter[199717]: ERROR   17:28:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:28:01 compute-0 openstack_network_exporter[199717]: ERROR   17:28:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:28:01 compute-0 openstack_network_exporter[199717]: ERROR   17:28:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:28:01 compute-0 openstack_network_exporter[199717]: ERROR   17:28:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:28:03 compute-0 podman[208083]: 2025-11-28 17:28:03.202372783 +0000 UTC m=+0.064954193 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 17:28:12 compute-0 podman[208107]: 2025-11-28 17:28:12.192571664 +0000 UTC m=+0.056346101 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Nov 28 17:28:17 compute-0 podman[208127]: 2025-11-28 17:28:17.207696694 +0000 UTC m=+0.064325572 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 28 17:28:17 compute-0 podman[208128]: 2025-11-28 17:28:17.22770011 +0000 UTC m=+0.086249664 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 28 17:28:18 compute-0 nova_compute[187223]: 2025-11-28 17:28:18.677 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:28:19 compute-0 podman[208170]: 2025-11-28 17:28:19.22268278 +0000 UTC m=+0.086800530 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., name=ubi9-minimal, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, distribution-scope=public)
Nov 28 17:28:20 compute-0 nova_compute[187223]: 2025-11-28 17:28:20.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:28:20 compute-0 nova_compute[187223]: 2025-11-28 17:28:20.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:28:20 compute-0 nova_compute[187223]: 2025-11-28 17:28:20.685 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 17:28:21 compute-0 nova_compute[187223]: 2025-11-28 17:28:21.685 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:28:21 compute-0 nova_compute[187223]: 2025-11-28 17:28:21.685 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:28:22 compute-0 nova_compute[187223]: 2025-11-28 17:28:22.685 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:28:22 compute-0 nova_compute[187223]: 2025-11-28 17:28:22.685 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 17:28:22 compute-0 nova_compute[187223]: 2025-11-28 17:28:22.685 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 17:28:22 compute-0 nova_compute[187223]: 2025-11-28 17:28:22.702 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 17:28:22 compute-0 nova_compute[187223]: 2025-11-28 17:28:22.703 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:28:22 compute-0 nova_compute[187223]: 2025-11-28 17:28:22.704 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:28:22 compute-0 nova_compute[187223]: 2025-11-28 17:28:22.704 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:28:22 compute-0 nova_compute[187223]: 2025-11-28 17:28:22.743 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:28:22 compute-0 nova_compute[187223]: 2025-11-28 17:28:22.743 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:28:22 compute-0 nova_compute[187223]: 2025-11-28 17:28:22.743 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:28:22 compute-0 nova_compute[187223]: 2025-11-28 17:28:22.744 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 17:28:22 compute-0 nova_compute[187223]: 2025-11-28 17:28:22.920 187227 WARNING nova.virt.libvirt.driver [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 17:28:22 compute-0 nova_compute[187223]: 2025-11-28 17:28:22.921 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6190MB free_disk=73.38011169433594GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 17:28:22 compute-0 nova_compute[187223]: 2025-11-28 17:28:22.922 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:28:22 compute-0 nova_compute[187223]: 2025-11-28 17:28:22.923 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:28:22 compute-0 nova_compute[187223]: 2025-11-28 17:28:22.971 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 17:28:22 compute-0 nova_compute[187223]: 2025-11-28 17:28:22.971 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 17:28:22 compute-0 nova_compute[187223]: 2025-11-28 17:28:22.990 187227 DEBUG nova.compute.provider_tree [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 17:28:23 compute-0 nova_compute[187223]: 2025-11-28 17:28:23.004 187227 DEBUG nova.scheduler.client.report [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 17:28:23 compute-0 nova_compute[187223]: 2025-11-28 17:28:23.006 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 17:28:23 compute-0 nova_compute[187223]: 2025-11-28 17:28:23.006 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.083s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:28:26 compute-0 nova_compute[187223]: 2025-11-28 17:28:26.008 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:28:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:28:27.671 104433 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:28:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:28:27.671 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:28:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:28:27.672 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:28:29 compute-0 podman[197556]: time="2025-11-28T17:28:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:28:29 compute-0 podman[197556]: @ - - [28/Nov/2025:17:28:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 28 17:28:29 compute-0 podman[197556]: @ - - [28/Nov/2025:17:28:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2569 "" "Go-http-client/1.1"
Nov 28 17:28:31 compute-0 openstack_network_exporter[199717]: ERROR   17:28:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:28:31 compute-0 openstack_network_exporter[199717]: ERROR   17:28:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:28:31 compute-0 openstack_network_exporter[199717]: ERROR   17:28:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:28:31 compute-0 openstack_network_exporter[199717]: ERROR   17:28:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:28:31 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:28:31 compute-0 openstack_network_exporter[199717]: ERROR   17:28:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:28:31 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:28:34 compute-0 podman[208191]: 2025-11-28 17:28:34.227138281 +0000 UTC m=+0.086069679 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 17:28:43 compute-0 podman[208215]: 2025-11-28 17:28:43.202973343 +0000 UTC m=+0.060002527 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 28 17:28:48 compute-0 podman[208231]: 2025-11-28 17:28:48.226338153 +0000 UTC m=+0.074507621 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 17:28:48 compute-0 podman[208232]: 2025-11-28 17:28:48.274221874 +0000 UTC m=+0.121933808 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 28 17:28:50 compute-0 podman[208273]: 2025-11-28 17:28:50.213415701 +0000 UTC m=+0.074713187 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, release=1755695350, vcs-type=git, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, maintainer=Red Hat, Inc., io.openshift.expose-services=, version=9.6)
Nov 28 17:28:59 compute-0 podman[197556]: time="2025-11-28T17:28:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:28:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:28:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 28 17:28:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:28:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2568 "" "Go-http-client/1.1"
Nov 28 17:29:01 compute-0 openstack_network_exporter[199717]: ERROR   17:29:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:29:01 compute-0 openstack_network_exporter[199717]: ERROR   17:29:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:29:01 compute-0 openstack_network_exporter[199717]: ERROR   17:29:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:29:01 compute-0 openstack_network_exporter[199717]: ERROR   17:29:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:29:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:29:01 compute-0 openstack_network_exporter[199717]: ERROR   17:29:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:29:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:29:05 compute-0 podman[208295]: 2025-11-28 17:29:05.201082872 +0000 UTC m=+0.063081817 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 17:29:14 compute-0 podman[208320]: 2025-11-28 17:29:14.227043332 +0000 UTC m=+0.093363482 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125)
Nov 28 17:29:19 compute-0 podman[208341]: 2025-11-28 17:29:19.235347517 +0000 UTC m=+0.089669418 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 17:29:19 compute-0 podman[208342]: 2025-11-28 17:29:19.272497747 +0000 UTC m=+0.112240326 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Nov 28 17:29:20 compute-0 nova_compute[187223]: 2025-11-28 17:29:20.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:29:21 compute-0 podman[208386]: 2025-11-28 17:29:21.209068683 +0000 UTC m=+0.069192470 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, io.openshift.expose-services=, name=ubi9-minimal, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, vcs-type=git, config_id=edpm)
Nov 28 17:29:21 compute-0 nova_compute[187223]: 2025-11-28 17:29:21.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:29:22 compute-0 nova_compute[187223]: 2025-11-28 17:29:22.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:29:22 compute-0 nova_compute[187223]: 2025-11-28 17:29:22.685 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:29:22 compute-0 nova_compute[187223]: 2025-11-28 17:29:22.685 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:29:22 compute-0 nova_compute[187223]: 2025-11-28 17:29:22.685 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 17:29:23 compute-0 nova_compute[187223]: 2025-11-28 17:29:23.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:29:24 compute-0 nova_compute[187223]: 2025-11-28 17:29:24.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:29:24 compute-0 nova_compute[187223]: 2025-11-28 17:29:24.683 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 17:29:24 compute-0 nova_compute[187223]: 2025-11-28 17:29:24.684 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 17:29:24 compute-0 nova_compute[187223]: 2025-11-28 17:29:24.702 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 17:29:24 compute-0 nova_compute[187223]: 2025-11-28 17:29:24.703 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:29:24 compute-0 nova_compute[187223]: 2025-11-28 17:29:24.724 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:29:24 compute-0 nova_compute[187223]: 2025-11-28 17:29:24.725 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:29:24 compute-0 nova_compute[187223]: 2025-11-28 17:29:24.725 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:29:24 compute-0 nova_compute[187223]: 2025-11-28 17:29:24.725 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 17:29:24 compute-0 nova_compute[187223]: 2025-11-28 17:29:24.886 187227 WARNING nova.virt.libvirt.driver [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 17:29:24 compute-0 nova_compute[187223]: 2025-11-28 17:29:24.887 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6194MB free_disk=73.37962341308594GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 17:29:24 compute-0 nova_compute[187223]: 2025-11-28 17:29:24.887 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:29:24 compute-0 nova_compute[187223]: 2025-11-28 17:29:24.887 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:29:24 compute-0 nova_compute[187223]: 2025-11-28 17:29:24.970 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 17:29:24 compute-0 nova_compute[187223]: 2025-11-28 17:29:24.970 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 17:29:25 compute-0 nova_compute[187223]: 2025-11-28 17:29:25.034 187227 DEBUG nova.compute.provider_tree [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 17:29:25 compute-0 nova_compute[187223]: 2025-11-28 17:29:25.058 187227 DEBUG nova.scheduler.client.report [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 17:29:25 compute-0 nova_compute[187223]: 2025-11-28 17:29:25.060 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 17:29:25 compute-0 nova_compute[187223]: 2025-11-28 17:29:25.060 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.173s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:29:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:29:27.673 104433 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:29:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:29:27.673 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:29:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:29:27.674 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:29:28 compute-0 nova_compute[187223]: 2025-11-28 17:29:28.056 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:29:29 compute-0 podman[197556]: time="2025-11-28T17:29:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:29:29 compute-0 podman[197556]: @ - - [28/Nov/2025:17:29:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 28 17:29:29 compute-0 podman[197556]: @ - - [28/Nov/2025:17:29:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2568 "" "Go-http-client/1.1"
Nov 28 17:29:31 compute-0 openstack_network_exporter[199717]: ERROR   17:29:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:29:31 compute-0 openstack_network_exporter[199717]: ERROR   17:29:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:29:31 compute-0 openstack_network_exporter[199717]: ERROR   17:29:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:29:31 compute-0 openstack_network_exporter[199717]: ERROR   17:29:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:29:31 compute-0 openstack_network_exporter[199717]: ERROR   17:29:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:29:36 compute-0 podman[208408]: 2025-11-28 17:29:36.236091486 +0000 UTC m=+0.088023658 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 17:29:40 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:29:40.407 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3e:3c:af', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '0e:e0:60:f9:5f:25'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 17:29:40 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:29:40.408 104433 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 17:29:45 compute-0 podman[208432]: 2025-11-28 17:29:45.192918169 +0000 UTC m=+0.051656871 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 28 17:29:46 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:29:46.413 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2ad2dbac-a967-40fb-b69b-7c374c5f8e9d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:29:50 compute-0 podman[208452]: 2025-11-28 17:29:50.207393148 +0000 UTC m=+0.062036147 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 17:29:50 compute-0 podman[208453]: 2025-11-28 17:29:50.242531519 +0000 UTC m=+0.092037506 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 28 17:29:52 compute-0 podman[208496]: 2025-11-28 17:29:52.200735626 +0000 UTC m=+0.063713618 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, io.buildah.version=1.33.7, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, config_id=edpm, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9)
Nov 28 17:29:59 compute-0 podman[197556]: time="2025-11-28T17:29:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:29:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:29:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 28 17:29:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:29:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2571 "" "Go-http-client/1.1"
Nov 28 17:30:01 compute-0 openstack_network_exporter[199717]: ERROR   17:30:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:30:01 compute-0 openstack_network_exporter[199717]: ERROR   17:30:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:30:01 compute-0 openstack_network_exporter[199717]: ERROR   17:30:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:30:01 compute-0 openstack_network_exporter[199717]: ERROR   17:30:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:30:01 compute-0 openstack_network_exporter[199717]: ERROR   17:30:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:30:07 compute-0 podman[208518]: 2025-11-28 17:30:07.195943659 +0000 UTC m=+0.053598418 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 17:30:16 compute-0 podman[208542]: 2025-11-28 17:30:16.200191588 +0000 UTC m=+0.057944058 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 28 17:30:21 compute-0 podman[208561]: 2025-11-28 17:30:21.207917939 +0000 UTC m=+0.066682016 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 28 17:30:21 compute-0 podman[208562]: 2025-11-28 17:30:21.236148676 +0000 UTC m=+0.092225023 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, managed_by=edpm_ansible)
Nov 28 17:30:21 compute-0 nova_compute[187223]: 2025-11-28 17:30:21.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:30:21 compute-0 nova_compute[187223]: 2025-11-28 17:30:21.685 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:30:22 compute-0 nova_compute[187223]: 2025-11-28 17:30:22.679 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:30:22 compute-0 nova_compute[187223]: 2025-11-28 17:30:22.707 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:30:23 compute-0 podman[208607]: 2025-11-28 17:30:23.203130272 +0000 UTC m=+0.061718622 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, release=1755695350, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, vcs-type=git, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, distribution-scope=public, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=)
Nov 28 17:30:23 compute-0 nova_compute[187223]: 2025-11-28 17:30:23.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:30:23 compute-0 nova_compute[187223]: 2025-11-28 17:30:23.684 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 17:30:24 compute-0 nova_compute[187223]: 2025-11-28 17:30:24.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:30:24 compute-0 nova_compute[187223]: 2025-11-28 17:30:24.685 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:30:25 compute-0 nova_compute[187223]: 2025-11-28 17:30:25.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:30:25 compute-0 nova_compute[187223]: 2025-11-28 17:30:25.759 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:30:25 compute-0 nova_compute[187223]: 2025-11-28 17:30:25.760 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:30:25 compute-0 nova_compute[187223]: 2025-11-28 17:30:25.760 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:30:25 compute-0 nova_compute[187223]: 2025-11-28 17:30:25.760 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 17:30:25 compute-0 nova_compute[187223]: 2025-11-28 17:30:25.948 187227 WARNING nova.virt.libvirt.driver [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 17:30:25 compute-0 nova_compute[187223]: 2025-11-28 17:30:25.950 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6191MB free_disk=73.37962341308594GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 17:30:25 compute-0 nova_compute[187223]: 2025-11-28 17:30:25.950 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:30:25 compute-0 nova_compute[187223]: 2025-11-28 17:30:25.950 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:30:26 compute-0 nova_compute[187223]: 2025-11-28 17:30:26.005 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 17:30:26 compute-0 nova_compute[187223]: 2025-11-28 17:30:26.006 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 17:30:26 compute-0 nova_compute[187223]: 2025-11-28 17:30:26.026 187227 DEBUG nova.compute.provider_tree [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 17:30:26 compute-0 nova_compute[187223]: 2025-11-28 17:30:26.046 187227 DEBUG nova.scheduler.client.report [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 17:30:26 compute-0 nova_compute[187223]: 2025-11-28 17:30:26.048 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 17:30:26 compute-0 nova_compute[187223]: 2025-11-28 17:30:26.048 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.098s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:30:27 compute-0 nova_compute[187223]: 2025-11-28 17:30:27.049 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:30:27 compute-0 nova_compute[187223]: 2025-11-28 17:30:27.050 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 17:30:27 compute-0 nova_compute[187223]: 2025-11-28 17:30:27.050 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 17:30:27 compute-0 nova_compute[187223]: 2025-11-28 17:30:27.080 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 17:30:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:30:27.674 104433 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:30:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:30:27.675 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:30:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:30:27.675 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:30:29 compute-0 nova_compute[187223]: 2025-11-28 17:30:29.709 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:30:29 compute-0 podman[197556]: time="2025-11-28T17:30:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:30:29 compute-0 podman[197556]: @ - - [28/Nov/2025:17:30:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 28 17:30:29 compute-0 podman[197556]: @ - - [28/Nov/2025:17:30:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2572 "" "Go-http-client/1.1"
Nov 28 17:30:31 compute-0 openstack_network_exporter[199717]: ERROR   17:30:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:30:31 compute-0 openstack_network_exporter[199717]: ERROR   17:30:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:30:31 compute-0 openstack_network_exporter[199717]: ERROR   17:30:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:30:31 compute-0 openstack_network_exporter[199717]: ERROR   17:30:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:30:31 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:30:31 compute-0 openstack_network_exporter[199717]: ERROR   17:30:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:30:31 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:30:38 compute-0 podman[208628]: 2025-11-28 17:30:38.193918428 +0000 UTC m=+0.056741026 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 17:30:41 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:30:41.036 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3e:3c:af', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '0e:e0:60:f9:5f:25'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 17:30:41 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:30:41.037 104433 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 17:30:44 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:30:44.039 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2ad2dbac-a967-40fb-b69b-7c374c5f8e9d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:30:47 compute-0 podman[208652]: 2025-11-28 17:30:47.223187441 +0000 UTC m=+0.067694588 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 28 17:30:52 compute-0 podman[208671]: 2025-11-28 17:30:52.266425906 +0000 UTC m=+0.062147035 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 28 17:30:52 compute-0 podman[208672]: 2025-11-28 17:30:52.339118399 +0000 UTC m=+0.137758484 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 17:30:54 compute-0 podman[208713]: 2025-11-28 17:30:54.20799578 +0000 UTC m=+0.071949862 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, version=9.6, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., name=ubi9-minimal, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 28 17:30:54 compute-0 nova_compute[187223]: 2025-11-28 17:30:54.885 187227 DEBUG oslo_concurrency.lockutils [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Acquiring lock "ea967bd2-166d-4969-ad81-03f2528ed4f5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:30:54 compute-0 nova_compute[187223]: 2025-11-28 17:30:54.887 187227 DEBUG oslo_concurrency.lockutils [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Lock "ea967bd2-166d-4969-ad81-03f2528ed4f5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:30:54 compute-0 nova_compute[187223]: 2025-11-28 17:30:54.903 187227 DEBUG nova.compute.manager [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 28 17:30:54 compute-0 nova_compute[187223]: 2025-11-28 17:30:54.972 187227 DEBUG oslo_concurrency.lockutils [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:30:54 compute-0 nova_compute[187223]: 2025-11-28 17:30:54.973 187227 DEBUG oslo_concurrency.lockutils [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:30:54 compute-0 nova_compute[187223]: 2025-11-28 17:30:54.979 187227 DEBUG nova.virt.hardware [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 28 17:30:54 compute-0 nova_compute[187223]: 2025-11-28 17:30:54.980 187227 INFO nova.compute.claims [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] Claim successful on node compute-0.ctlplane.example.com
Nov 28 17:30:55 compute-0 nova_compute[187223]: 2025-11-28 17:30:55.278 187227 DEBUG nova.compute.provider_tree [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 17:30:55 compute-0 nova_compute[187223]: 2025-11-28 17:30:55.291 187227 DEBUG nova.scheduler.client.report [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 17:30:55 compute-0 nova_compute[187223]: 2025-11-28 17:30:55.309 187227 DEBUG oslo_concurrency.lockutils [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.337s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:30:55 compute-0 nova_compute[187223]: 2025-11-28 17:30:55.310 187227 DEBUG nova.compute.manager [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 28 17:30:55 compute-0 nova_compute[187223]: 2025-11-28 17:30:55.345 187227 DEBUG nova.compute.manager [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 28 17:30:55 compute-0 nova_compute[187223]: 2025-11-28 17:30:55.345 187227 DEBUG nova.network.neutron [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 28 17:30:55 compute-0 nova_compute[187223]: 2025-11-28 17:30:55.363 187227 INFO nova.virt.libvirt.driver [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 28 17:30:55 compute-0 nova_compute[187223]: 2025-11-28 17:30:55.377 187227 DEBUG nova.compute.manager [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 28 17:30:55 compute-0 nova_compute[187223]: 2025-11-28 17:30:55.448 187227 DEBUG nova.compute.manager [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 28 17:30:55 compute-0 nova_compute[187223]: 2025-11-28 17:30:55.450 187227 DEBUG nova.virt.libvirt.driver [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 28 17:30:55 compute-0 nova_compute[187223]: 2025-11-28 17:30:55.451 187227 INFO nova.virt.libvirt.driver [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] Creating image(s)
Nov 28 17:30:55 compute-0 nova_compute[187223]: 2025-11-28 17:30:55.451 187227 DEBUG oslo_concurrency.lockutils [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Acquiring lock "/var/lib/nova/instances/ea967bd2-166d-4969-ad81-03f2528ed4f5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:30:55 compute-0 nova_compute[187223]: 2025-11-28 17:30:55.451 187227 DEBUG oslo_concurrency.lockutils [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Lock "/var/lib/nova/instances/ea967bd2-166d-4969-ad81-03f2528ed4f5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:30:55 compute-0 nova_compute[187223]: 2025-11-28 17:30:55.452 187227 DEBUG oslo_concurrency.lockutils [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Lock "/var/lib/nova/instances/ea967bd2-166d-4969-ad81-03f2528ed4f5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:30:55 compute-0 nova_compute[187223]: 2025-11-28 17:30:55.453 187227 DEBUG oslo_concurrency.lockutils [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Acquiring lock "bd6565e0cc32420aac21dd3de8842bc53bb13eba" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:30:55 compute-0 nova_compute[187223]: 2025-11-28 17:30:55.453 187227 DEBUG oslo_concurrency.lockutils [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Lock "bd6565e0cc32420aac21dd3de8842bc53bb13eba" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:30:55 compute-0 nova_compute[187223]: 2025-11-28 17:30:55.860 187227 WARNING oslo_policy.policy [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Nov 28 17:30:55 compute-0 nova_compute[187223]: 2025-11-28 17:30:55.861 187227 WARNING oslo_policy.policy [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Nov 28 17:30:55 compute-0 nova_compute[187223]: 2025-11-28 17:30:55.863 187227 DEBUG nova.policy [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f7ca965410e74fcabced6e50aab5d096', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ea47683b97094cc99b882a5a1b90949f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 28 17:30:56 compute-0 nova_compute[187223]: 2025-11-28 17:30:56.479 187227 DEBUG nova.network.neutron [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] Successfully created port: 5d57189a-27f6-43f1-8c3f-e6c6389babcd _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 28 17:30:56 compute-0 nova_compute[187223]: 2025-11-28 17:30:56.961 187227 DEBUG oslo_concurrency.processutils [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:30:57 compute-0 nova_compute[187223]: 2025-11-28 17:30:57.029 187227 DEBUG oslo_concurrency.processutils [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba.part --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:30:57 compute-0 nova_compute[187223]: 2025-11-28 17:30:57.030 187227 DEBUG nova.virt.images [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] e66bcfff-a835-4b6a-9892-490d158c356a was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Nov 28 17:30:57 compute-0 nova_compute[187223]: 2025-11-28 17:30:57.032 187227 DEBUG nova.privsep.utils [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Nov 28 17:30:57 compute-0 nova_compute[187223]: 2025-11-28 17:30:57.033 187227 DEBUG oslo_concurrency.processutils [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba.part /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:30:57 compute-0 nova_compute[187223]: 2025-11-28 17:30:57.253 187227 DEBUG oslo_concurrency.processutils [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba.part /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba.converted" returned: 0 in 0.220s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:30:57 compute-0 nova_compute[187223]: 2025-11-28 17:30:57.263 187227 DEBUG oslo_concurrency.processutils [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:30:57 compute-0 nova_compute[187223]: 2025-11-28 17:30:57.323 187227 DEBUG oslo_concurrency.processutils [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba.converted --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:30:57 compute-0 nova_compute[187223]: 2025-11-28 17:30:57.325 187227 DEBUG oslo_concurrency.lockutils [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Lock "bd6565e0cc32420aac21dd3de8842bc53bb13eba" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.871s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:30:57 compute-0 nova_compute[187223]: 2025-11-28 17:30:57.344 187227 INFO oslo.privsep.daemon [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmp0t17viqw/privsep.sock']
Nov 28 17:30:58 compute-0 nova_compute[187223]: 2025-11-28 17:30:58.167 187227 INFO oslo.privsep.daemon [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Spawned new privsep daemon via rootwrap
Nov 28 17:30:58 compute-0 nova_compute[187223]: 2025-11-28 17:30:57.998 208756 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 28 17:30:58 compute-0 nova_compute[187223]: 2025-11-28 17:30:58.002 208756 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 28 17:30:58 compute-0 nova_compute[187223]: 2025-11-28 17:30:58.004 208756 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Nov 28 17:30:58 compute-0 nova_compute[187223]: 2025-11-28 17:30:58.005 208756 INFO oslo.privsep.daemon [-] privsep daemon running as pid 208756
Nov 28 17:30:58 compute-0 nova_compute[187223]: 2025-11-28 17:30:58.256 187227 DEBUG oslo_concurrency.processutils [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:30:58 compute-0 nova_compute[187223]: 2025-11-28 17:30:58.325 187227 DEBUG oslo_concurrency.processutils [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:30:58 compute-0 nova_compute[187223]: 2025-11-28 17:30:58.327 187227 DEBUG oslo_concurrency.lockutils [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Acquiring lock "bd6565e0cc32420aac21dd3de8842bc53bb13eba" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:30:58 compute-0 nova_compute[187223]: 2025-11-28 17:30:58.328 187227 DEBUG oslo_concurrency.lockutils [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Lock "bd6565e0cc32420aac21dd3de8842bc53bb13eba" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:30:58 compute-0 nova_compute[187223]: 2025-11-28 17:30:58.351 187227 DEBUG oslo_concurrency.processutils [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:30:58 compute-0 nova_compute[187223]: 2025-11-28 17:30:58.409 187227 DEBUG oslo_concurrency.processutils [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:30:58 compute-0 nova_compute[187223]: 2025-11-28 17:30:58.411 187227 DEBUG oslo_concurrency.processutils [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba,backing_fmt=raw /var/lib/nova/instances/ea967bd2-166d-4969-ad81-03f2528ed4f5/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:30:58 compute-0 nova_compute[187223]: 2025-11-28 17:30:58.448 187227 DEBUG oslo_concurrency.processutils [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba,backing_fmt=raw /var/lib/nova/instances/ea967bd2-166d-4969-ad81-03f2528ed4f5/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:30:58 compute-0 nova_compute[187223]: 2025-11-28 17:30:58.450 187227 DEBUG oslo_concurrency.lockutils [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Lock "bd6565e0cc32420aac21dd3de8842bc53bb13eba" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:30:58 compute-0 nova_compute[187223]: 2025-11-28 17:30:58.451 187227 DEBUG oslo_concurrency.processutils [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:30:58 compute-0 nova_compute[187223]: 2025-11-28 17:30:58.512 187227 DEBUG oslo_concurrency.processutils [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:30:58 compute-0 nova_compute[187223]: 2025-11-28 17:30:58.515 187227 DEBUG nova.virt.disk.api [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Checking if we can resize image /var/lib/nova/instances/ea967bd2-166d-4969-ad81-03f2528ed4f5/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 28 17:30:58 compute-0 nova_compute[187223]: 2025-11-28 17:30:58.516 187227 DEBUG oslo_concurrency.processutils [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ea967bd2-166d-4969-ad81-03f2528ed4f5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:30:58 compute-0 nova_compute[187223]: 2025-11-28 17:30:58.577 187227 DEBUG oslo_concurrency.processutils [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ea967bd2-166d-4969-ad81-03f2528ed4f5/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:30:58 compute-0 nova_compute[187223]: 2025-11-28 17:30:58.579 187227 DEBUG nova.virt.disk.api [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Cannot resize image /var/lib/nova/instances/ea967bd2-166d-4969-ad81-03f2528ed4f5/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 28 17:30:58 compute-0 nova_compute[187223]: 2025-11-28 17:30:58.580 187227 DEBUG nova.objects.instance [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Lazy-loading 'migration_context' on Instance uuid ea967bd2-166d-4969-ad81-03f2528ed4f5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:30:58 compute-0 nova_compute[187223]: 2025-11-28 17:30:58.613 187227 DEBUG nova.virt.libvirt.driver [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 28 17:30:58 compute-0 nova_compute[187223]: 2025-11-28 17:30:58.614 187227 DEBUG nova.virt.libvirt.driver [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] Ensure instance console log exists: /var/lib/nova/instances/ea967bd2-166d-4969-ad81-03f2528ed4f5/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 28 17:30:58 compute-0 nova_compute[187223]: 2025-11-28 17:30:58.615 187227 DEBUG oslo_concurrency.lockutils [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:30:58 compute-0 nova_compute[187223]: 2025-11-28 17:30:58.615 187227 DEBUG oslo_concurrency.lockutils [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:30:58 compute-0 nova_compute[187223]: 2025-11-28 17:30:58.615 187227 DEBUG oslo_concurrency.lockutils [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:30:58 compute-0 nova_compute[187223]: 2025-11-28 17:30:58.674 187227 DEBUG nova.network.neutron [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] Successfully updated port: 5d57189a-27f6-43f1-8c3f-e6c6389babcd _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 28 17:30:58 compute-0 nova_compute[187223]: 2025-11-28 17:30:58.690 187227 DEBUG oslo_concurrency.lockutils [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Acquiring lock "refresh_cache-ea967bd2-166d-4969-ad81-03f2528ed4f5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 17:30:58 compute-0 nova_compute[187223]: 2025-11-28 17:30:58.691 187227 DEBUG oslo_concurrency.lockutils [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Acquired lock "refresh_cache-ea967bd2-166d-4969-ad81-03f2528ed4f5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 17:30:58 compute-0 nova_compute[187223]: 2025-11-28 17:30:58.691 187227 DEBUG nova.network.neutron [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 28 17:30:58 compute-0 nova_compute[187223]: 2025-11-28 17:30:58.772 187227 DEBUG nova.compute.manager [req-16b0ec0b-4802-40d0-abc0-629c19330896 req-f17d2a76-975a-43cc-828c-24f9076e8860 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] Received event network-changed-5d57189a-27f6-43f1-8c3f-e6c6389babcd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:30:58 compute-0 nova_compute[187223]: 2025-11-28 17:30:58.773 187227 DEBUG nova.compute.manager [req-16b0ec0b-4802-40d0-abc0-629c19330896 req-f17d2a76-975a-43cc-828c-24f9076e8860 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] Refreshing instance network info cache due to event network-changed-5d57189a-27f6-43f1-8c3f-e6c6389babcd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 28 17:30:58 compute-0 nova_compute[187223]: 2025-11-28 17:30:58.773 187227 DEBUG oslo_concurrency.lockutils [req-16b0ec0b-4802-40d0-abc0-629c19330896 req-f17d2a76-975a-43cc-828c-24f9076e8860 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "refresh_cache-ea967bd2-166d-4969-ad81-03f2528ed4f5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 17:30:58 compute-0 nova_compute[187223]: 2025-11-28 17:30:58.814 187227 DEBUG nova.network.neutron [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 28 17:30:59 compute-0 podman[197556]: time="2025-11-28T17:30:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:30:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:30:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 28 17:30:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:30:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2578 "" "Go-http-client/1.1"
Nov 28 17:31:00 compute-0 nova_compute[187223]: 2025-11-28 17:31:00.340 187227 DEBUG nova.network.neutron [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] Updating instance_info_cache with network_info: [{"id": "5d57189a-27f6-43f1-8c3f-e6c6389babcd", "address": "fa:16:3e:9d:bf:e7", "network": {"id": "015f34bb-5da1-42eb-bab2-066f32a46dd5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-213410289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ea47683b97094cc99b882a5a1b90949f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d57189a-27", "ovs_interfaceid": "5d57189a-27f6-43f1-8c3f-e6c6389babcd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:31:00 compute-0 nova_compute[187223]: 2025-11-28 17:31:00.373 187227 DEBUG oslo_concurrency.lockutils [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Releasing lock "refresh_cache-ea967bd2-166d-4969-ad81-03f2528ed4f5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 17:31:00 compute-0 nova_compute[187223]: 2025-11-28 17:31:00.374 187227 DEBUG nova.compute.manager [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] Instance network_info: |[{"id": "5d57189a-27f6-43f1-8c3f-e6c6389babcd", "address": "fa:16:3e:9d:bf:e7", "network": {"id": "015f34bb-5da1-42eb-bab2-066f32a46dd5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-213410289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ea47683b97094cc99b882a5a1b90949f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d57189a-27", "ovs_interfaceid": "5d57189a-27f6-43f1-8c3f-e6c6389babcd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 28 17:31:00 compute-0 nova_compute[187223]: 2025-11-28 17:31:00.375 187227 DEBUG oslo_concurrency.lockutils [req-16b0ec0b-4802-40d0-abc0-629c19330896 req-f17d2a76-975a-43cc-828c-24f9076e8860 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquired lock "refresh_cache-ea967bd2-166d-4969-ad81-03f2528ed4f5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 17:31:00 compute-0 nova_compute[187223]: 2025-11-28 17:31:00.375 187227 DEBUG nova.network.neutron [req-16b0ec0b-4802-40d0-abc0-629c19330896 req-f17d2a76-975a-43cc-828c-24f9076e8860 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] Refreshing network info cache for port 5d57189a-27f6-43f1-8c3f-e6c6389babcd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 28 17:31:00 compute-0 nova_compute[187223]: 2025-11-28 17:31:00.379 187227 DEBUG nova.virt.libvirt.driver [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] Start _get_guest_xml network_info=[{"id": "5d57189a-27f6-43f1-8c3f-e6c6389babcd", "address": "fa:16:3e:9d:bf:e7", "network": {"id": "015f34bb-5da1-42eb-bab2-066f32a46dd5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-213410289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ea47683b97094cc99b882a5a1b90949f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d57189a-27", "ovs_interfaceid": "5d57189a-27f6-43f1-8c3f-e6c6389babcd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T17:27:07Z,direct_url=<?>,disk_format='qcow2',id=e66bcfff-a835-4b6a-9892-490d158c356a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ca3b936304c947f98c618f4bf25dee0e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-28T17:27:08Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_format': None, 'device_name': '/dev/vda', 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e66bcfff-a835-4b6a-9892-490d158c356a'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 28 17:31:00 compute-0 nova_compute[187223]: 2025-11-28 17:31:00.386 187227 WARNING nova.virt.libvirt.driver [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 17:31:00 compute-0 nova_compute[187223]: 2025-11-28 17:31:00.391 187227 DEBUG nova.virt.libvirt.host [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 28 17:31:00 compute-0 nova_compute[187223]: 2025-11-28 17:31:00.391 187227 DEBUG nova.virt.libvirt.host [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 28 17:31:00 compute-0 nova_compute[187223]: 2025-11-28 17:31:00.394 187227 DEBUG nova.virt.libvirt.host [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 28 17:31:00 compute-0 nova_compute[187223]: 2025-11-28 17:31:00.395 187227 DEBUG nova.virt.libvirt.host [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 28 17:31:00 compute-0 nova_compute[187223]: 2025-11-28 17:31:00.397 187227 DEBUG nova.virt.libvirt.driver [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 28 17:31:00 compute-0 nova_compute[187223]: 2025-11-28 17:31:00.398 187227 DEBUG nova.virt.hardware [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-28T17:27:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6f44bded-bdbe-4623-9c87-afc5919e8381',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T17:27:07Z,direct_url=<?>,disk_format='qcow2',id=e66bcfff-a835-4b6a-9892-490d158c356a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ca3b936304c947f98c618f4bf25dee0e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-28T17:27:08Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 28 17:31:00 compute-0 nova_compute[187223]: 2025-11-28 17:31:00.398 187227 DEBUG nova.virt.hardware [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 28 17:31:00 compute-0 nova_compute[187223]: 2025-11-28 17:31:00.398 187227 DEBUG nova.virt.hardware [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 28 17:31:00 compute-0 nova_compute[187223]: 2025-11-28 17:31:00.399 187227 DEBUG nova.virt.hardware [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 28 17:31:00 compute-0 nova_compute[187223]: 2025-11-28 17:31:00.399 187227 DEBUG nova.virt.hardware [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 28 17:31:00 compute-0 nova_compute[187223]: 2025-11-28 17:31:00.399 187227 DEBUG nova.virt.hardware [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 28 17:31:00 compute-0 nova_compute[187223]: 2025-11-28 17:31:00.399 187227 DEBUG nova.virt.hardware [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 28 17:31:00 compute-0 nova_compute[187223]: 2025-11-28 17:31:00.399 187227 DEBUG nova.virt.hardware [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 28 17:31:00 compute-0 nova_compute[187223]: 2025-11-28 17:31:00.400 187227 DEBUG nova.virt.hardware [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 28 17:31:00 compute-0 nova_compute[187223]: 2025-11-28 17:31:00.400 187227 DEBUG nova.virt.hardware [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 28 17:31:00 compute-0 nova_compute[187223]: 2025-11-28 17:31:00.400 187227 DEBUG nova.virt.hardware [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 28 17:31:00 compute-0 nova_compute[187223]: 2025-11-28 17:31:00.404 187227 DEBUG nova.privsep.utils [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Nov 28 17:31:00 compute-0 nova_compute[187223]: 2025-11-28 17:31:00.405 187227 DEBUG nova.virt.libvirt.vif [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T17:30:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-73137176',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-73137176',id=2,image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ea47683b97094cc99b882a5a1b90949f',ramdisk_id='',reservation_id='r-130rbgod',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-544878882',owner_user_name='tempest-TestExecuteActionsViaActuator-544878882-project-m
ember'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T17:30:55Z,user_data=None,user_id='f7ca965410e74fcabced6e50aab5d096',uuid=ea967bd2-166d-4969-ad81-03f2528ed4f5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5d57189a-27f6-43f1-8c3f-e6c6389babcd", "address": "fa:16:3e:9d:bf:e7", "network": {"id": "015f34bb-5da1-42eb-bab2-066f32a46dd5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-213410289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ea47683b97094cc99b882a5a1b90949f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d57189a-27", "ovs_interfaceid": "5d57189a-27f6-43f1-8c3f-e6c6389babcd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 28 17:31:00 compute-0 nova_compute[187223]: 2025-11-28 17:31:00.405 187227 DEBUG nova.network.os_vif_util [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Converting VIF {"id": "5d57189a-27f6-43f1-8c3f-e6c6389babcd", "address": "fa:16:3e:9d:bf:e7", "network": {"id": "015f34bb-5da1-42eb-bab2-066f32a46dd5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-213410289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ea47683b97094cc99b882a5a1b90949f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d57189a-27", "ovs_interfaceid": "5d57189a-27f6-43f1-8c3f-e6c6389babcd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 17:31:00 compute-0 nova_compute[187223]: 2025-11-28 17:31:00.406 187227 DEBUG nova.network.os_vif_util [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9d:bf:e7,bridge_name='br-int',has_traffic_filtering=True,id=5d57189a-27f6-43f1-8c3f-e6c6389babcd,network=Network(015f34bb-5da1-42eb-bab2-066f32a46dd5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5d57189a-27') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 17:31:00 compute-0 nova_compute[187223]: 2025-11-28 17:31:00.408 187227 DEBUG nova.objects.instance [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Lazy-loading 'pci_devices' on Instance uuid ea967bd2-166d-4969-ad81-03f2528ed4f5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:31:00 compute-0 nova_compute[187223]: 2025-11-28 17:31:00.424 187227 DEBUG nova.virt.libvirt.driver [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] End _get_guest_xml xml=<domain type="kvm">
Nov 28 17:31:00 compute-0 nova_compute[187223]:   <uuid>ea967bd2-166d-4969-ad81-03f2528ed4f5</uuid>
Nov 28 17:31:00 compute-0 nova_compute[187223]:   <name>instance-00000002</name>
Nov 28 17:31:00 compute-0 nova_compute[187223]:   <memory>131072</memory>
Nov 28 17:31:00 compute-0 nova_compute[187223]:   <vcpu>1</vcpu>
Nov 28 17:31:00 compute-0 nova_compute[187223]:   <metadata>
Nov 28 17:31:00 compute-0 nova_compute[187223]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 28 17:31:00 compute-0 nova_compute[187223]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 28 17:31:00 compute-0 nova_compute[187223]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-73137176</nova:name>
Nov 28 17:31:00 compute-0 nova_compute[187223]:       <nova:creationTime>2025-11-28 17:31:00</nova:creationTime>
Nov 28 17:31:00 compute-0 nova_compute[187223]:       <nova:flavor name="m1.nano">
Nov 28 17:31:00 compute-0 nova_compute[187223]:         <nova:memory>128</nova:memory>
Nov 28 17:31:00 compute-0 nova_compute[187223]:         <nova:disk>1</nova:disk>
Nov 28 17:31:00 compute-0 nova_compute[187223]:         <nova:swap>0</nova:swap>
Nov 28 17:31:00 compute-0 nova_compute[187223]:         <nova:ephemeral>0</nova:ephemeral>
Nov 28 17:31:00 compute-0 nova_compute[187223]:         <nova:vcpus>1</nova:vcpus>
Nov 28 17:31:00 compute-0 nova_compute[187223]:       </nova:flavor>
Nov 28 17:31:00 compute-0 nova_compute[187223]:       <nova:owner>
Nov 28 17:31:00 compute-0 nova_compute[187223]:         <nova:user uuid="f7ca965410e74fcabced6e50aab5d096">tempest-TestExecuteActionsViaActuator-544878882-project-member</nova:user>
Nov 28 17:31:00 compute-0 nova_compute[187223]:         <nova:project uuid="ea47683b97094cc99b882a5a1b90949f">tempest-TestExecuteActionsViaActuator-544878882</nova:project>
Nov 28 17:31:00 compute-0 nova_compute[187223]:       </nova:owner>
Nov 28 17:31:00 compute-0 nova_compute[187223]:       <nova:root type="image" uuid="e66bcfff-a835-4b6a-9892-490d158c356a"/>
Nov 28 17:31:00 compute-0 nova_compute[187223]:       <nova:ports>
Nov 28 17:31:00 compute-0 nova_compute[187223]:         <nova:port uuid="5d57189a-27f6-43f1-8c3f-e6c6389babcd">
Nov 28 17:31:00 compute-0 nova_compute[187223]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 28 17:31:00 compute-0 nova_compute[187223]:         </nova:port>
Nov 28 17:31:00 compute-0 nova_compute[187223]:       </nova:ports>
Nov 28 17:31:00 compute-0 nova_compute[187223]:     </nova:instance>
Nov 28 17:31:00 compute-0 nova_compute[187223]:   </metadata>
Nov 28 17:31:00 compute-0 nova_compute[187223]:   <sysinfo type="smbios">
Nov 28 17:31:00 compute-0 nova_compute[187223]:     <system>
Nov 28 17:31:00 compute-0 nova_compute[187223]:       <entry name="manufacturer">RDO</entry>
Nov 28 17:31:00 compute-0 nova_compute[187223]:       <entry name="product">OpenStack Compute</entry>
Nov 28 17:31:00 compute-0 nova_compute[187223]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 28 17:31:00 compute-0 nova_compute[187223]:       <entry name="serial">ea967bd2-166d-4969-ad81-03f2528ed4f5</entry>
Nov 28 17:31:00 compute-0 nova_compute[187223]:       <entry name="uuid">ea967bd2-166d-4969-ad81-03f2528ed4f5</entry>
Nov 28 17:31:00 compute-0 nova_compute[187223]:       <entry name="family">Virtual Machine</entry>
Nov 28 17:31:00 compute-0 nova_compute[187223]:     </system>
Nov 28 17:31:00 compute-0 nova_compute[187223]:   </sysinfo>
Nov 28 17:31:00 compute-0 nova_compute[187223]:   <os>
Nov 28 17:31:00 compute-0 nova_compute[187223]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 28 17:31:00 compute-0 nova_compute[187223]:     <boot dev="hd"/>
Nov 28 17:31:00 compute-0 nova_compute[187223]:     <smbios mode="sysinfo"/>
Nov 28 17:31:00 compute-0 nova_compute[187223]:   </os>
Nov 28 17:31:00 compute-0 nova_compute[187223]:   <features>
Nov 28 17:31:00 compute-0 nova_compute[187223]:     <acpi/>
Nov 28 17:31:00 compute-0 nova_compute[187223]:     <apic/>
Nov 28 17:31:00 compute-0 nova_compute[187223]:     <vmcoreinfo/>
Nov 28 17:31:00 compute-0 nova_compute[187223]:   </features>
Nov 28 17:31:00 compute-0 nova_compute[187223]:   <clock offset="utc">
Nov 28 17:31:00 compute-0 nova_compute[187223]:     <timer name="pit" tickpolicy="delay"/>
Nov 28 17:31:00 compute-0 nova_compute[187223]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 28 17:31:00 compute-0 nova_compute[187223]:     <timer name="hpet" present="no"/>
Nov 28 17:31:00 compute-0 nova_compute[187223]:   </clock>
Nov 28 17:31:00 compute-0 nova_compute[187223]:   <cpu mode="custom" match="exact">
Nov 28 17:31:00 compute-0 nova_compute[187223]:     <model>Nehalem</model>
Nov 28 17:31:00 compute-0 nova_compute[187223]:     <topology sockets="1" cores="1" threads="1"/>
Nov 28 17:31:00 compute-0 nova_compute[187223]:   </cpu>
Nov 28 17:31:00 compute-0 nova_compute[187223]:   <devices>
Nov 28 17:31:00 compute-0 nova_compute[187223]:     <disk type="file" device="disk">
Nov 28 17:31:00 compute-0 nova_compute[187223]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 28 17:31:00 compute-0 nova_compute[187223]:       <source file="/var/lib/nova/instances/ea967bd2-166d-4969-ad81-03f2528ed4f5/disk"/>
Nov 28 17:31:00 compute-0 nova_compute[187223]:       <target dev="vda" bus="virtio"/>
Nov 28 17:31:00 compute-0 nova_compute[187223]:     </disk>
Nov 28 17:31:00 compute-0 nova_compute[187223]:     <disk type="file" device="cdrom">
Nov 28 17:31:00 compute-0 nova_compute[187223]:       <driver name="qemu" type="raw" cache="none"/>
Nov 28 17:31:00 compute-0 nova_compute[187223]:       <source file="/var/lib/nova/instances/ea967bd2-166d-4969-ad81-03f2528ed4f5/disk.config"/>
Nov 28 17:31:00 compute-0 nova_compute[187223]:       <target dev="sda" bus="sata"/>
Nov 28 17:31:00 compute-0 nova_compute[187223]:     </disk>
Nov 28 17:31:00 compute-0 nova_compute[187223]:     <interface type="ethernet">
Nov 28 17:31:00 compute-0 nova_compute[187223]:       <mac address="fa:16:3e:9d:bf:e7"/>
Nov 28 17:31:00 compute-0 nova_compute[187223]:       <model type="virtio"/>
Nov 28 17:31:00 compute-0 nova_compute[187223]:       <driver name="vhost" rx_queue_size="512"/>
Nov 28 17:31:00 compute-0 nova_compute[187223]:       <mtu size="1442"/>
Nov 28 17:31:00 compute-0 nova_compute[187223]:       <target dev="tap5d57189a-27"/>
Nov 28 17:31:00 compute-0 nova_compute[187223]:     </interface>
Nov 28 17:31:00 compute-0 nova_compute[187223]:     <serial type="pty">
Nov 28 17:31:00 compute-0 nova_compute[187223]:       <log file="/var/lib/nova/instances/ea967bd2-166d-4969-ad81-03f2528ed4f5/console.log" append="off"/>
Nov 28 17:31:00 compute-0 nova_compute[187223]:     </serial>
Nov 28 17:31:00 compute-0 nova_compute[187223]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 28 17:31:00 compute-0 nova_compute[187223]:     <video>
Nov 28 17:31:00 compute-0 nova_compute[187223]:       <model type="virtio"/>
Nov 28 17:31:00 compute-0 nova_compute[187223]:     </video>
Nov 28 17:31:00 compute-0 nova_compute[187223]:     <input type="tablet" bus="usb"/>
Nov 28 17:31:00 compute-0 nova_compute[187223]:     <rng model="virtio">
Nov 28 17:31:00 compute-0 nova_compute[187223]:       <backend model="random">/dev/urandom</backend>
Nov 28 17:31:00 compute-0 nova_compute[187223]:     </rng>
Nov 28 17:31:00 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root"/>
Nov 28 17:31:00 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:00 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:00 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:00 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:00 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:00 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:00 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:00 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:00 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:00 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:00 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:00 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:00 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:00 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:00 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:00 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:00 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:00 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:00 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:00 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:00 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:00 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:00 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:00 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:00 compute-0 nova_compute[187223]:     <controller type="usb" index="0"/>
Nov 28 17:31:00 compute-0 nova_compute[187223]:     <memballoon model="virtio">
Nov 28 17:31:00 compute-0 nova_compute[187223]:       <stats period="10"/>
Nov 28 17:31:00 compute-0 nova_compute[187223]:     </memballoon>
Nov 28 17:31:00 compute-0 nova_compute[187223]:   </devices>
Nov 28 17:31:00 compute-0 nova_compute[187223]: </domain>
Nov 28 17:31:00 compute-0 nova_compute[187223]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 28 17:31:00 compute-0 nova_compute[187223]: 2025-11-28 17:31:00.426 187227 DEBUG nova.compute.manager [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] Preparing to wait for external event network-vif-plugged-5d57189a-27f6-43f1-8c3f-e6c6389babcd prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 28 17:31:00 compute-0 nova_compute[187223]: 2025-11-28 17:31:00.427 187227 DEBUG oslo_concurrency.lockutils [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Acquiring lock "ea967bd2-166d-4969-ad81-03f2528ed4f5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:31:00 compute-0 nova_compute[187223]: 2025-11-28 17:31:00.428 187227 DEBUG oslo_concurrency.lockutils [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Lock "ea967bd2-166d-4969-ad81-03f2528ed4f5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:31:00 compute-0 nova_compute[187223]: 2025-11-28 17:31:00.428 187227 DEBUG oslo_concurrency.lockutils [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Lock "ea967bd2-166d-4969-ad81-03f2528ed4f5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:31:00 compute-0 nova_compute[187223]: 2025-11-28 17:31:00.429 187227 DEBUG nova.virt.libvirt.vif [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T17:30:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-73137176',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-73137176',id=2,image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ea47683b97094cc99b882a5a1b90949f',ramdisk_id='',reservation_id='r-130rbgod',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-544878882',owner_user_name='tempest-TestExecuteActionsViaActuator-544878882
-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T17:30:55Z,user_data=None,user_id='f7ca965410e74fcabced6e50aab5d096',uuid=ea967bd2-166d-4969-ad81-03f2528ed4f5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5d57189a-27f6-43f1-8c3f-e6c6389babcd", "address": "fa:16:3e:9d:bf:e7", "network": {"id": "015f34bb-5da1-42eb-bab2-066f32a46dd5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-213410289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ea47683b97094cc99b882a5a1b90949f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d57189a-27", "ovs_interfaceid": "5d57189a-27f6-43f1-8c3f-e6c6389babcd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 28 17:31:00 compute-0 nova_compute[187223]: 2025-11-28 17:31:00.429 187227 DEBUG nova.network.os_vif_util [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Converting VIF {"id": "5d57189a-27f6-43f1-8c3f-e6c6389babcd", "address": "fa:16:3e:9d:bf:e7", "network": {"id": "015f34bb-5da1-42eb-bab2-066f32a46dd5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-213410289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ea47683b97094cc99b882a5a1b90949f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d57189a-27", "ovs_interfaceid": "5d57189a-27f6-43f1-8c3f-e6c6389babcd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 17:31:00 compute-0 nova_compute[187223]: 2025-11-28 17:31:00.430 187227 DEBUG nova.network.os_vif_util [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9d:bf:e7,bridge_name='br-int',has_traffic_filtering=True,id=5d57189a-27f6-43f1-8c3f-e6c6389babcd,network=Network(015f34bb-5da1-42eb-bab2-066f32a46dd5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5d57189a-27') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 17:31:00 compute-0 nova_compute[187223]: 2025-11-28 17:31:00.430 187227 DEBUG os_vif [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9d:bf:e7,bridge_name='br-int',has_traffic_filtering=True,id=5d57189a-27f6-43f1-8c3f-e6c6389babcd,network=Network(015f34bb-5da1-42eb-bab2-066f32a46dd5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5d57189a-27') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 28 17:31:00 compute-0 nova_compute[187223]: 2025-11-28 17:31:00.468 187227 DEBUG ovsdbapp.backend.ovs_idl [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 28 17:31:00 compute-0 nova_compute[187223]: 2025-11-28 17:31:00.468 187227 DEBUG ovsdbapp.backend.ovs_idl [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 28 17:31:00 compute-0 nova_compute[187223]: 2025-11-28 17:31:00.468 187227 DEBUG ovsdbapp.backend.ovs_idl [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 28 17:31:00 compute-0 nova_compute[187223]: 2025-11-28 17:31:00.470 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 17:31:00 compute-0 nova_compute[187223]: 2025-11-28 17:31:00.471 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [POLLOUT] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:31:00 compute-0 nova_compute[187223]: 2025-11-28 17:31:00.471 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 28 17:31:00 compute-0 nova_compute[187223]: 2025-11-28 17:31:00.472 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:31:00 compute-0 nova_compute[187223]: 2025-11-28 17:31:00.474 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:31:00 compute-0 nova_compute[187223]: 2025-11-28 17:31:00.477 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:31:00 compute-0 nova_compute[187223]: 2025-11-28 17:31:00.488 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:31:00 compute-0 nova_compute[187223]: 2025-11-28 17:31:00.489 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:31:00 compute-0 nova_compute[187223]: 2025-11-28 17:31:00.489 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 17:31:00 compute-0 nova_compute[187223]: 2025-11-28 17:31:00.490 187227 INFO oslo.privsep.daemon [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmp6nkyaatj/privsep.sock']
Nov 28 17:31:01 compute-0 nova_compute[187223]: 2025-11-28 17:31:01.192 187227 INFO oslo.privsep.daemon [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Spawned new privsep daemon via rootwrap
Nov 28 17:31:01 compute-0 nova_compute[187223]: 2025-11-28 17:31:01.047 208777 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 28 17:31:01 compute-0 nova_compute[187223]: 2025-11-28 17:31:01.054 208777 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 28 17:31:01 compute-0 nova_compute[187223]: 2025-11-28 17:31:01.058 208777 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Nov 28 17:31:01 compute-0 nova_compute[187223]: 2025-11-28 17:31:01.058 208777 INFO oslo.privsep.daemon [-] privsep daemon running as pid 208777
Nov 28 17:31:01 compute-0 openstack_network_exporter[199717]: ERROR   17:31:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:31:01 compute-0 openstack_network_exporter[199717]: ERROR   17:31:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:31:01 compute-0 openstack_network_exporter[199717]: ERROR   17:31:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:31:01 compute-0 openstack_network_exporter[199717]: ERROR   17:31:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:31:01 compute-0 openstack_network_exporter[199717]: ERROR   17:31:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:31:01 compute-0 nova_compute[187223]: 2025-11-28 17:31:01.522 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:31:01 compute-0 nova_compute[187223]: 2025-11-28 17:31:01.523 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5d57189a-27, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:31:01 compute-0 nova_compute[187223]: 2025-11-28 17:31:01.523 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5d57189a-27, col_values=(('external_ids', {'iface-id': '5d57189a-27f6-43f1-8c3f-e6c6389babcd', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9d:bf:e7', 'vm-uuid': 'ea967bd2-166d-4969-ad81-03f2528ed4f5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:31:01 compute-0 nova_compute[187223]: 2025-11-28 17:31:01.525 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:31:01 compute-0 nova_compute[187223]: 2025-11-28 17:31:01.528 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 17:31:01 compute-0 NetworkManager[55763]: <info>  [1764351061.5281] manager: (tap5d57189a-27): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/20)
Nov 28 17:31:01 compute-0 nova_compute[187223]: 2025-11-28 17:31:01.537 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:31:01 compute-0 nova_compute[187223]: 2025-11-28 17:31:01.538 187227 INFO os_vif [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9d:bf:e7,bridge_name='br-int',has_traffic_filtering=True,id=5d57189a-27f6-43f1-8c3f-e6c6389babcd,network=Network(015f34bb-5da1-42eb-bab2-066f32a46dd5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5d57189a-27')
Nov 28 17:31:01 compute-0 nova_compute[187223]: 2025-11-28 17:31:01.583 187227 DEBUG nova.virt.libvirt.driver [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 28 17:31:01 compute-0 nova_compute[187223]: 2025-11-28 17:31:01.583 187227 DEBUG nova.virt.libvirt.driver [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 28 17:31:01 compute-0 nova_compute[187223]: 2025-11-28 17:31:01.583 187227 DEBUG nova.virt.libvirt.driver [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] No VIF found with MAC fa:16:3e:9d:bf:e7, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 28 17:31:01 compute-0 nova_compute[187223]: 2025-11-28 17:31:01.584 187227 INFO nova.virt.libvirt.driver [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] Using config drive
Nov 28 17:31:02 compute-0 nova_compute[187223]: 2025-11-28 17:31:02.214 187227 INFO nova.virt.libvirt.driver [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] Creating config drive at /var/lib/nova/instances/ea967bd2-166d-4969-ad81-03f2528ed4f5/disk.config
Nov 28 17:31:02 compute-0 nova_compute[187223]: 2025-11-28 17:31:02.220 187227 DEBUG oslo_concurrency.processutils [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ea967bd2-166d-4969-ad81-03f2528ed4f5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbffhp9c4 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:31:02 compute-0 nova_compute[187223]: 2025-11-28 17:31:02.346 187227 DEBUG oslo_concurrency.processutils [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ea967bd2-166d-4969-ad81-03f2528ed4f5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbffhp9c4" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:31:02 compute-0 nova_compute[187223]: 2025-11-28 17:31:02.372 187227 DEBUG nova.network.neutron [req-16b0ec0b-4802-40d0-abc0-629c19330896 req-f17d2a76-975a-43cc-828c-24f9076e8860 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] Updated VIF entry in instance network info cache for port 5d57189a-27f6-43f1-8c3f-e6c6389babcd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 28 17:31:02 compute-0 nova_compute[187223]: 2025-11-28 17:31:02.373 187227 DEBUG nova.network.neutron [req-16b0ec0b-4802-40d0-abc0-629c19330896 req-f17d2a76-975a-43cc-828c-24f9076e8860 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] Updating instance_info_cache with network_info: [{"id": "5d57189a-27f6-43f1-8c3f-e6c6389babcd", "address": "fa:16:3e:9d:bf:e7", "network": {"id": "015f34bb-5da1-42eb-bab2-066f32a46dd5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-213410289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ea47683b97094cc99b882a5a1b90949f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d57189a-27", "ovs_interfaceid": "5d57189a-27f6-43f1-8c3f-e6c6389babcd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:31:02 compute-0 nova_compute[187223]: 2025-11-28 17:31:02.390 187227 DEBUG oslo_concurrency.lockutils [req-16b0ec0b-4802-40d0-abc0-629c19330896 req-f17d2a76-975a-43cc-828c-24f9076e8860 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Releasing lock "refresh_cache-ea967bd2-166d-4969-ad81-03f2528ed4f5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 17:31:02 compute-0 kernel: tun: Universal TUN/TAP device driver, 1.6
Nov 28 17:31:02 compute-0 kernel: tap5d57189a-27: entered promiscuous mode
Nov 28 17:31:02 compute-0 NetworkManager[55763]: <info>  [1764351062.4619] manager: (tap5d57189a-27): new Tun device (/org/freedesktop/NetworkManager/Devices/21)
Nov 28 17:31:02 compute-0 ovn_controller[95574]: 2025-11-28T17:31:02Z|00027|binding|INFO|Claiming lport 5d57189a-27f6-43f1-8c3f-e6c6389babcd for this chassis.
Nov 28 17:31:02 compute-0 ovn_controller[95574]: 2025-11-28T17:31:02Z|00028|binding|INFO|5d57189a-27f6-43f1-8c3f-e6c6389babcd: Claiming fa:16:3e:9d:bf:e7 10.100.0.12
Nov 28 17:31:02 compute-0 nova_compute[187223]: 2025-11-28 17:31:02.463 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:31:02 compute-0 nova_compute[187223]: 2025-11-28 17:31:02.466 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:31:02 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:02.478 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9d:bf:e7 10.100.0.12'], port_security=['fa:16:3e:9d:bf:e7 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'ea967bd2-166d-4969-ad81-03f2528ed4f5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-015f34bb-5da1-42eb-bab2-066f32a46dd5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ea47683b97094cc99b882a5a1b90949f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '82cdfce6-8f2d-44f3-bd0a-80dabea6bfd5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f4d8a34e-2c92-41ae-a2d1-bdb3f1fafb55, chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], logical_port=5d57189a-27f6-43f1-8c3f-e6c6389babcd) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 17:31:02 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:02.480 104433 INFO neutron.agent.ovn.metadata.agent [-] Port 5d57189a-27f6-43f1-8c3f-e6c6389babcd in datapath 015f34bb-5da1-42eb-bab2-066f32a46dd5 bound to our chassis
Nov 28 17:31:02 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:02.485 104433 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 015f34bb-5da1-42eb-bab2-066f32a46dd5
Nov 28 17:31:02 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:02.487 104433 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmp6hkb0kcx/privsep.sock']
Nov 28 17:31:02 compute-0 systemd-udevd[208801]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 17:31:02 compute-0 NetworkManager[55763]: <info>  [1764351062.5240] device (tap5d57189a-27): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 28 17:31:02 compute-0 NetworkManager[55763]: <info>  [1764351062.5253] device (tap5d57189a-27): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 28 17:31:02 compute-0 nova_compute[187223]: 2025-11-28 17:31:02.536 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:31:02 compute-0 ovn_controller[95574]: 2025-11-28T17:31:02Z|00029|binding|INFO|Setting lport 5d57189a-27f6-43f1-8c3f-e6c6389babcd ovn-installed in OVS
Nov 28 17:31:02 compute-0 ovn_controller[95574]: 2025-11-28T17:31:02Z|00030|binding|INFO|Setting lport 5d57189a-27f6-43f1-8c3f-e6c6389babcd up in Southbound
Nov 28 17:31:02 compute-0 nova_compute[187223]: 2025-11-28 17:31:02.545 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:31:02 compute-0 systemd-machined[153517]: New machine qemu-1-instance-00000002.
Nov 28 17:31:02 compute-0 systemd[1]: Started Virtual Machine qemu-1-instance-00000002.
Nov 28 17:31:02 compute-0 nova_compute[187223]: 2025-11-28 17:31:02.862 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:31:02 compute-0 nova_compute[187223]: 2025-11-28 17:31:02.982 187227 DEBUG nova.virt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Emitting event <LifecycleEvent: 1764351062.9817228, ea967bd2-166d-4969-ad81-03f2528ed4f5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:31:02 compute-0 nova_compute[187223]: 2025-11-28 17:31:02.983 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] VM Started (Lifecycle Event)
Nov 28 17:31:03 compute-0 nova_compute[187223]: 2025-11-28 17:31:03.019 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:31:03 compute-0 nova_compute[187223]: 2025-11-28 17:31:03.028 187227 DEBUG nova.virt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Emitting event <LifecycleEvent: 1764351062.9818745, ea967bd2-166d-4969-ad81-03f2528ed4f5 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:31:03 compute-0 nova_compute[187223]: 2025-11-28 17:31:03.029 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] VM Paused (Lifecycle Event)
Nov 28 17:31:03 compute-0 nova_compute[187223]: 2025-11-28 17:31:03.047 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:31:03 compute-0 nova_compute[187223]: 2025-11-28 17:31:03.051 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 28 17:31:03 compute-0 nova_compute[187223]: 2025-11-28 17:31:03.071 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 28 17:31:03 compute-0 nova_compute[187223]: 2025-11-28 17:31:03.197 187227 DEBUG nova.compute.manager [req-ea5426f5-e42d-4dab-b98c-9628ebd5a7e6 req-5898b726-588d-431e-8cf7-9b3d4954283f 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] Received event network-vif-plugged-5d57189a-27f6-43f1-8c3f-e6c6389babcd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:31:03 compute-0 nova_compute[187223]: 2025-11-28 17:31:03.197 187227 DEBUG oslo_concurrency.lockutils [req-ea5426f5-e42d-4dab-b98c-9628ebd5a7e6 req-5898b726-588d-431e-8cf7-9b3d4954283f 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "ea967bd2-166d-4969-ad81-03f2528ed4f5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:31:03 compute-0 nova_compute[187223]: 2025-11-28 17:31:03.199 187227 DEBUG oslo_concurrency.lockutils [req-ea5426f5-e42d-4dab-b98c-9628ebd5a7e6 req-5898b726-588d-431e-8cf7-9b3d4954283f 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "ea967bd2-166d-4969-ad81-03f2528ed4f5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:31:03 compute-0 nova_compute[187223]: 2025-11-28 17:31:03.200 187227 DEBUG oslo_concurrency.lockutils [req-ea5426f5-e42d-4dab-b98c-9628ebd5a7e6 req-5898b726-588d-431e-8cf7-9b3d4954283f 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "ea967bd2-166d-4969-ad81-03f2528ed4f5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:31:03 compute-0 nova_compute[187223]: 2025-11-28 17:31:03.200 187227 DEBUG nova.compute.manager [req-ea5426f5-e42d-4dab-b98c-9628ebd5a7e6 req-5898b726-588d-431e-8cf7-9b3d4954283f 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] Processing event network-vif-plugged-5d57189a-27f6-43f1-8c3f-e6c6389babcd _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 28 17:31:03 compute-0 nova_compute[187223]: 2025-11-28 17:31:03.201 187227 DEBUG nova.compute.manager [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 28 17:31:03 compute-0 nova_compute[187223]: 2025-11-28 17:31:03.205 187227 DEBUG nova.virt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Emitting event <LifecycleEvent: 1764351063.2048824, ea967bd2-166d-4969-ad81-03f2528ed4f5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:31:03 compute-0 nova_compute[187223]: 2025-11-28 17:31:03.205 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] VM Resumed (Lifecycle Event)
Nov 28 17:31:03 compute-0 nova_compute[187223]: 2025-11-28 17:31:03.207 187227 DEBUG nova.virt.libvirt.driver [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 28 17:31:03 compute-0 nova_compute[187223]: 2025-11-28 17:31:03.217 187227 INFO nova.virt.libvirt.driver [-] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] Instance spawned successfully.
Nov 28 17:31:03 compute-0 nova_compute[187223]: 2025-11-28 17:31:03.218 187227 DEBUG nova.virt.libvirt.driver [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 28 17:31:03 compute-0 nova_compute[187223]: 2025-11-28 17:31:03.226 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:31:03 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:03.227 104433 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Nov 28 17:31:03 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:03.228 104433 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp6hkb0kcx/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Nov 28 17:31:03 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:03.081 208826 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 28 17:31:03 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:03.085 208826 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 28 17:31:03 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:03.088 208826 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Nov 28 17:31:03 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:03.088 208826 INFO oslo.privsep.daemon [-] privsep daemon running as pid 208826
Nov 28 17:31:03 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:03.234 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[28b5a94e-cfa4-459c-bf97-dd7811422c6f]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:31:03 compute-0 nova_compute[187223]: 2025-11-28 17:31:03.240 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 28 17:31:03 compute-0 nova_compute[187223]: 2025-11-28 17:31:03.244 187227 DEBUG nova.virt.libvirt.driver [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:31:03 compute-0 nova_compute[187223]: 2025-11-28 17:31:03.245 187227 DEBUG nova.virt.libvirt.driver [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:31:03 compute-0 nova_compute[187223]: 2025-11-28 17:31:03.246 187227 DEBUG nova.virt.libvirt.driver [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:31:03 compute-0 nova_compute[187223]: 2025-11-28 17:31:03.246 187227 DEBUG nova.virt.libvirt.driver [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:31:03 compute-0 nova_compute[187223]: 2025-11-28 17:31:03.247 187227 DEBUG nova.virt.libvirt.driver [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:31:03 compute-0 nova_compute[187223]: 2025-11-28 17:31:03.247 187227 DEBUG nova.virt.libvirt.driver [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:31:03 compute-0 nova_compute[187223]: 2025-11-28 17:31:03.272 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 28 17:31:03 compute-0 nova_compute[187223]: 2025-11-28 17:31:03.351 187227 INFO nova.compute.manager [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] Took 7.90 seconds to spawn the instance on the hypervisor.
Nov 28 17:31:03 compute-0 nova_compute[187223]: 2025-11-28 17:31:03.352 187227 DEBUG nova.compute.manager [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:31:03 compute-0 nova_compute[187223]: 2025-11-28 17:31:03.411 187227 INFO nova.compute.manager [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] Took 8.47 seconds to build instance.
Nov 28 17:31:03 compute-0 nova_compute[187223]: 2025-11-28 17:31:03.430 187227 DEBUG oslo_concurrency.lockutils [None req-78aefed8-a50b-435e-b8c1-8f79f7dc487e f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Lock "ea967bd2-166d-4969-ad81-03f2528ed4f5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.543s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:31:03 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:03.775 208826 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:31:03 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:03.775 208826 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:31:03 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:03.776 208826 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:31:04 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:04.346 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[4101bf24-aac3-46d6-967b-16dd3d1358b7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:31:04 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:04.347 104433 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap015f34bb-51 in ovnmeta-015f34bb-5da1-42eb-bab2-066f32a46dd5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 28 17:31:04 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:04.349 208826 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap015f34bb-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 28 17:31:04 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:04.349 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[4f68cf1b-77aa-4a70-ab38-4e8041613b75]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:31:04 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:04.352 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[437a97af-33d2-4e8d-b9ea-ff4874343946]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:31:04 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:04.383 104546 DEBUG oslo.privsep.daemon [-] privsep: reply[107a447e-2629-4bd6-929e-5462bf96bb88]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:31:04 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:04.400 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[051d525f-6ee6-43eb-af91-a6d812418545]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:31:04 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:04.403 104433 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmpfqhzhenm/privsep.sock']
Nov 28 17:31:05 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:05.139 104433 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Nov 28 17:31:05 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:05.140 104433 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpfqhzhenm/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Nov 28 17:31:05 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:05.014 208840 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 28 17:31:05 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:05.022 208840 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 28 17:31:05 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:05.026 208840 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Nov 28 17:31:05 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:05.026 208840 INFO oslo.privsep.daemon [-] privsep daemon running as pid 208840
Nov 28 17:31:05 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:05.144 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[1fa67318-237e-4905-86b5-6a4a76ea0f17]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:31:05 compute-0 nova_compute[187223]: 2025-11-28 17:31:05.559 187227 DEBUG nova.compute.manager [req-038c0df9-f832-491a-9929-c20f8f157b2b req-74119f89-7da9-4740-9420-8bcdfba01cf7 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] Received event network-vif-plugged-5d57189a-27f6-43f1-8c3f-e6c6389babcd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:31:05 compute-0 nova_compute[187223]: 2025-11-28 17:31:05.559 187227 DEBUG oslo_concurrency.lockutils [req-038c0df9-f832-491a-9929-c20f8f157b2b req-74119f89-7da9-4740-9420-8bcdfba01cf7 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "ea967bd2-166d-4969-ad81-03f2528ed4f5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:31:05 compute-0 nova_compute[187223]: 2025-11-28 17:31:05.559 187227 DEBUG oslo_concurrency.lockutils [req-038c0df9-f832-491a-9929-c20f8f157b2b req-74119f89-7da9-4740-9420-8bcdfba01cf7 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "ea967bd2-166d-4969-ad81-03f2528ed4f5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:31:05 compute-0 nova_compute[187223]: 2025-11-28 17:31:05.559 187227 DEBUG oslo_concurrency.lockutils [req-038c0df9-f832-491a-9929-c20f8f157b2b req-74119f89-7da9-4740-9420-8bcdfba01cf7 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "ea967bd2-166d-4969-ad81-03f2528ed4f5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:31:05 compute-0 nova_compute[187223]: 2025-11-28 17:31:05.560 187227 DEBUG nova.compute.manager [req-038c0df9-f832-491a-9929-c20f8f157b2b req-74119f89-7da9-4740-9420-8bcdfba01cf7 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] No waiting events found dispatching network-vif-plugged-5d57189a-27f6-43f1-8c3f-e6c6389babcd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:31:05 compute-0 nova_compute[187223]: 2025-11-28 17:31:05.560 187227 WARNING nova.compute.manager [req-038c0df9-f832-491a-9929-c20f8f157b2b req-74119f89-7da9-4740-9420-8bcdfba01cf7 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] Received unexpected event network-vif-plugged-5d57189a-27f6-43f1-8c3f-e6c6389babcd for instance with vm_state active and task_state None.
Nov 28 17:31:05 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:05.765 208840 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:31:05 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:05.765 208840 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:31:05 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:05.765 208840 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:31:06 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:06.379 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[6185e15b-43eb-4286-976b-9d9bb373b166]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:31:06 compute-0 NetworkManager[55763]: <info>  [1764351066.4994] manager: (tap015f34bb-50): new Veth device (/org/freedesktop/NetworkManager/Devices/22)
Nov 28 17:31:06 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:06.498 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[a24d1b06-735a-4da2-bb2e-084e876f1847]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:31:06 compute-0 nova_compute[187223]: 2025-11-28 17:31:06.527 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:31:06 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:06.537 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[d0e3ad9c-4ed2-4161-ab56-1fbc6c63dd9a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:31:06 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:06.541 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[2c33d3a9-bcc2-4384-adaf-1583b9ebda13]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:31:06 compute-0 systemd-udevd[208855]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 17:31:06 compute-0 NetworkManager[55763]: <info>  [1764351066.5687] device (tap015f34bb-50): carrier: link connected
Nov 28 17:31:06 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:06.576 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[d63efb94-8025-4669-83ff-038f39d82890]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:31:06 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:06.606 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[4a7bd00c-0e99-488a-8209-dacaf29f9cab]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap015f34bb-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3e:3c:9a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 424119, 'reachable_time': 33234, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 208859, 'error': None, 'target': 'ovnmeta-015f34bb-5da1-42eb-bab2-066f32a46dd5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:31:06 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:06.627 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[6aa04058-fa8d-48a8-bc5e-83837644a203]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe3e:3c9a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 424119, 'tstamp': 424119}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 208872, 'error': None, 'target': 'ovnmeta-015f34bb-5da1-42eb-bab2-066f32a46dd5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:31:06 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:06.650 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[6265b118-aeeb-4161-8924-bc9673f11694]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap015f34bb-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3e:3c:9a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 424119, 'reachable_time': 33234, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 208874, 'error': None, 'target': 'ovnmeta-015f34bb-5da1-42eb-bab2-066f32a46dd5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:31:06 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:06.710 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[64d4b6b6-e52c-4e31-9c07-65a72fd99e78]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:31:06 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:06.779 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[12ec2212-d0fe-4b55-bc4d-bfd829fff5e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:31:06 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:06.781 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap015f34bb-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:31:06 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:06.782 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 17:31:06 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:06.783 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap015f34bb-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:31:06 compute-0 nova_compute[187223]: 2025-11-28 17:31:06.785 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:31:06 compute-0 kernel: tap015f34bb-50: entered promiscuous mode
Nov 28 17:31:06 compute-0 NetworkManager[55763]: <info>  [1764351066.7873] manager: (tap015f34bb-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/23)
Nov 28 17:31:06 compute-0 nova_compute[187223]: 2025-11-28 17:31:06.789 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:31:06 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:06.791 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap015f34bb-50, col_values=(('external_ids', {'iface-id': '2de820a4-0104-4404-a104-bd64f5ebe5e5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:31:06 compute-0 nova_compute[187223]: 2025-11-28 17:31:06.792 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:31:06 compute-0 ovn_controller[95574]: 2025-11-28T17:31:06Z|00031|binding|INFO|Releasing lport 2de820a4-0104-4404-a104-bd64f5ebe5e5 from this chassis (sb_readonly=0)
Nov 28 17:31:06 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:06.794 104433 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/015f34bb-5da1-42eb-bab2-066f32a46dd5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/015f34bb-5da1-42eb-bab2-066f32a46dd5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 28 17:31:06 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:06.796 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[1657230d-71da-4a35-a6c0-d1b3c7d5e671]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:31:06 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:06.797 104433 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 28 17:31:06 compute-0 ovn_metadata_agent[104428]: global
Nov 28 17:31:06 compute-0 ovn_metadata_agent[104428]:     log         /dev/log local0 debug
Nov 28 17:31:06 compute-0 ovn_metadata_agent[104428]:     log-tag     haproxy-metadata-proxy-015f34bb-5da1-42eb-bab2-066f32a46dd5
Nov 28 17:31:06 compute-0 ovn_metadata_agent[104428]:     user        root
Nov 28 17:31:06 compute-0 ovn_metadata_agent[104428]:     group       root
Nov 28 17:31:06 compute-0 ovn_metadata_agent[104428]:     maxconn     1024
Nov 28 17:31:06 compute-0 ovn_metadata_agent[104428]:     pidfile     /var/lib/neutron/external/pids/015f34bb-5da1-42eb-bab2-066f32a46dd5.pid.haproxy
Nov 28 17:31:06 compute-0 ovn_metadata_agent[104428]:     daemon
Nov 28 17:31:06 compute-0 ovn_metadata_agent[104428]: 
Nov 28 17:31:06 compute-0 ovn_metadata_agent[104428]: defaults
Nov 28 17:31:06 compute-0 ovn_metadata_agent[104428]:     log global
Nov 28 17:31:06 compute-0 ovn_metadata_agent[104428]:     mode http
Nov 28 17:31:06 compute-0 ovn_metadata_agent[104428]:     option httplog
Nov 28 17:31:06 compute-0 ovn_metadata_agent[104428]:     option dontlognull
Nov 28 17:31:06 compute-0 ovn_metadata_agent[104428]:     option http-server-close
Nov 28 17:31:06 compute-0 ovn_metadata_agent[104428]:     option forwardfor
Nov 28 17:31:06 compute-0 ovn_metadata_agent[104428]:     retries                 3
Nov 28 17:31:06 compute-0 ovn_metadata_agent[104428]:     timeout http-request    30s
Nov 28 17:31:06 compute-0 ovn_metadata_agent[104428]:     timeout connect         30s
Nov 28 17:31:06 compute-0 ovn_metadata_agent[104428]:     timeout client          32s
Nov 28 17:31:06 compute-0 ovn_metadata_agent[104428]:     timeout server          32s
Nov 28 17:31:06 compute-0 ovn_metadata_agent[104428]:     timeout http-keep-alive 30s
Nov 28 17:31:06 compute-0 ovn_metadata_agent[104428]: 
Nov 28 17:31:06 compute-0 ovn_metadata_agent[104428]: 
Nov 28 17:31:06 compute-0 ovn_metadata_agent[104428]: listen listener
Nov 28 17:31:06 compute-0 ovn_metadata_agent[104428]:     bind 169.254.169.254:80
Nov 28 17:31:06 compute-0 ovn_metadata_agent[104428]:     server metadata /var/lib/neutron/metadata_proxy
Nov 28 17:31:06 compute-0 ovn_metadata_agent[104428]:     http-request add-header X-OVN-Network-ID 015f34bb-5da1-42eb-bab2-066f32a46dd5
Nov 28 17:31:06 compute-0 ovn_metadata_agent[104428]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 28 17:31:06 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:06.798 104433 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-015f34bb-5da1-42eb-bab2-066f32a46dd5', 'env', 'PROCESS_TAG=haproxy-015f34bb-5da1-42eb-bab2-066f32a46dd5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/015f34bb-5da1-42eb-bab2-066f32a46dd5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 28 17:31:06 compute-0 nova_compute[187223]: 2025-11-28 17:31:06.804 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:31:07 compute-0 podman[208907]: 2025-11-28 17:31:07.224178814 +0000 UTC m=+0.066702589 container create 88ccfd8f2dab68b51ef911882a9edd57018a2466c55178d8364938a9b9f6d9b1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-015f34bb-5da1-42eb-bab2-066f32a46dd5, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 28 17:31:07 compute-0 systemd[1]: Started libpod-conmon-88ccfd8f2dab68b51ef911882a9edd57018a2466c55178d8364938a9b9f6d9b1.scope.
Nov 28 17:31:07 compute-0 podman[208907]: 2025-11-28 17:31:07.194626316 +0000 UTC m=+0.037150111 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 17:31:07 compute-0 systemd[1]: Started libcrun container.
Nov 28 17:31:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f69dfa19d87e1ff8d2c78c2d3813f2d69f5299be571fa11735ce35cd0a66f638/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 17:31:07 compute-0 podman[208907]: 2025-11-28 17:31:07.332185244 +0000 UTC m=+0.174709019 container init 88ccfd8f2dab68b51ef911882a9edd57018a2466c55178d8364938a9b9f6d9b1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-015f34bb-5da1-42eb-bab2-066f32a46dd5, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Nov 28 17:31:07 compute-0 podman[208907]: 2025-11-28 17:31:07.340381684 +0000 UTC m=+0.182905459 container start 88ccfd8f2dab68b51ef911882a9edd57018a2466c55178d8364938a9b9f6d9b1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-015f34bb-5da1-42eb-bab2-066f32a46dd5, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 28 17:31:07 compute-0 neutron-haproxy-ovnmeta-015f34bb-5da1-42eb-bab2-066f32a46dd5[208921]: [NOTICE]   (208925) : New worker (208927) forked
Nov 28 17:31:07 compute-0 neutron-haproxy-ovnmeta-015f34bb-5da1-42eb-bab2-066f32a46dd5[208921]: [NOTICE]   (208925) : Loading success.
Nov 28 17:31:07 compute-0 nova_compute[187223]: 2025-11-28 17:31:07.866 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:31:09 compute-0 podman[208936]: 2025-11-28 17:31:09.223867452 +0000 UTC m=+0.082332018 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 17:31:11 compute-0 nova_compute[187223]: 2025-11-28 17:31:11.531 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:31:12 compute-0 nova_compute[187223]: 2025-11-28 17:31:12.870 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:31:15 compute-0 nova_compute[187223]: 2025-11-28 17:31:15.654 187227 DEBUG nova.compute.manager [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 8ce9d497-2a8a-4c42-b93f-5778740cbc9b] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560
Nov 28 17:31:15 compute-0 nova_compute[187223]: 2025-11-28 17:31:15.752 187227 DEBUG oslo_concurrency.lockutils [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:31:15 compute-0 nova_compute[187223]: 2025-11-28 17:31:15.753 187227 DEBUG oslo_concurrency.lockutils [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:31:15 compute-0 nova_compute[187223]: 2025-11-28 17:31:15.786 187227 DEBUG nova.objects.instance [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lazy-loading 'pci_requests' on Instance uuid 8ce9d497-2a8a-4c42-b93f-5778740cbc9b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:31:15 compute-0 nova_compute[187223]: 2025-11-28 17:31:15.810 187227 DEBUG nova.virt.hardware [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 28 17:31:15 compute-0 nova_compute[187223]: 2025-11-28 17:31:15.811 187227 INFO nova.compute.claims [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 8ce9d497-2a8a-4c42-b93f-5778740cbc9b] Claim successful on node compute-0.ctlplane.example.com
Nov 28 17:31:15 compute-0 nova_compute[187223]: 2025-11-28 17:31:15.812 187227 DEBUG nova.objects.instance [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lazy-loading 'resources' on Instance uuid 8ce9d497-2a8a-4c42-b93f-5778740cbc9b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:31:15 compute-0 nova_compute[187223]: 2025-11-28 17:31:15.826 187227 DEBUG nova.objects.instance [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lazy-loading 'pci_devices' on Instance uuid 8ce9d497-2a8a-4c42-b93f-5778740cbc9b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:31:16 compute-0 nova_compute[187223]: 2025-11-28 17:31:16.057 187227 INFO nova.compute.resource_tracker [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 8ce9d497-2a8a-4c42-b93f-5778740cbc9b] Updating resource usage from migration 62a97eae-dd4a-4d2a-bd90-dac4f9f1baec
Nov 28 17:31:16 compute-0 nova_compute[187223]: 2025-11-28 17:31:16.059 187227 DEBUG nova.compute.resource_tracker [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 8ce9d497-2a8a-4c42-b93f-5778740cbc9b] Starting to track incoming migration 62a97eae-dd4a-4d2a-bd90-dac4f9f1baec with flavor 0f80ec62-dee8-4253-8ca9-0848bcaf92f4 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Nov 28 17:31:16 compute-0 nova_compute[187223]: 2025-11-28 17:31:16.209 187227 DEBUG nova.compute.provider_tree [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Updating inventory in ProviderTree for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 with inventory: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 28 17:31:16 compute-0 nova_compute[187223]: 2025-11-28 17:31:16.379 187227 ERROR nova.scheduler.client.report [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [req-1d3e7770-7223-470c-869e-2278fb2d92d0] Failed to update inventory to [{'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-1d3e7770-7223-470c-869e-2278fb2d92d0"}]}
Nov 28 17:31:16 compute-0 nova_compute[187223]: 2025-11-28 17:31:16.402 187227 DEBUG nova.scheduler.client.report [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Refreshing inventories for resource provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 28 17:31:16 compute-0 nova_compute[187223]: 2025-11-28 17:31:16.420 187227 DEBUG nova.scheduler.client.report [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Updating ProviderTree inventory for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 28 17:31:16 compute-0 nova_compute[187223]: 2025-11-28 17:31:16.421 187227 DEBUG nova.compute.provider_tree [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Updating inventory in ProviderTree for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 28 17:31:16 compute-0 nova_compute[187223]: 2025-11-28 17:31:16.436 187227 DEBUG nova.scheduler.client.report [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Refreshing aggregate associations for resource provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 28 17:31:16 compute-0 ovn_controller[95574]: 2025-11-28T17:31:16Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:9d:bf:e7 10.100.0.12
Nov 28 17:31:16 compute-0 ovn_controller[95574]: 2025-11-28T17:31:16Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:9d:bf:e7 10.100.0.12
Nov 28 17:31:16 compute-0 nova_compute[187223]: 2025-11-28 17:31:16.466 187227 DEBUG nova.scheduler.client.report [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Refreshing trait associations for resource provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5, traits: COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_USB,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE41,HW_CPU_X86_SSE42,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_SSE,HW_CPU_X86_SSSE3,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 28 17:31:16 compute-0 nova_compute[187223]: 2025-11-28 17:31:16.516 187227 DEBUG nova.compute.provider_tree [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Updating inventory in ProviderTree for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 with inventory: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 28 17:31:16 compute-0 nova_compute[187223]: 2025-11-28 17:31:16.537 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:31:16 compute-0 nova_compute[187223]: 2025-11-28 17:31:16.575 187227 DEBUG nova.scheduler.client.report [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Updated inventory for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 with generation 4 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Nov 28 17:31:16 compute-0 nova_compute[187223]: 2025-11-28 17:31:16.576 187227 DEBUG nova.compute.provider_tree [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Updating resource provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 generation from 4 to 5 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Nov 28 17:31:16 compute-0 nova_compute[187223]: 2025-11-28 17:31:16.576 187227 DEBUG nova.compute.provider_tree [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Updating inventory in ProviderTree for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 with inventory: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 28 17:31:16 compute-0 nova_compute[187223]: 2025-11-28 17:31:16.596 187227 DEBUG oslo_concurrency.lockutils [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.843s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:31:16 compute-0 nova_compute[187223]: 2025-11-28 17:31:16.596 187227 INFO nova.compute.manager [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 8ce9d497-2a8a-4c42-b93f-5778740cbc9b] Migrating
Nov 28 17:31:16 compute-0 nova_compute[187223]: 2025-11-28 17:31:16.596 187227 DEBUG oslo_concurrency.lockutils [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 17:31:16 compute-0 nova_compute[187223]: 2025-11-28 17:31:16.597 187227 DEBUG oslo_concurrency.lockutils [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquired lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 17:31:16 compute-0 nova_compute[187223]: 2025-11-28 17:31:16.604 187227 INFO nova.compute.rpcapi [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Automatically selected compute RPC version 6.2 from minimum service version 66
Nov 28 17:31:16 compute-0 nova_compute[187223]: 2025-11-28 17:31:16.605 187227 DEBUG oslo_concurrency.lockutils [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Releasing lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 17:31:17 compute-0 nova_compute[187223]: 2025-11-28 17:31:17.914 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:31:18 compute-0 podman[208987]: 2025-11-28 17:31:18.246201131 +0000 UTC m=+0.088055636 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 28 17:31:18 compute-0 sshd-session[209008]: Accepted publickey for nova from 192.168.122.101 port 33090 ssh2: ECDSA SHA256:MQeyK0000lOAPc/y+l43YtWHgJLzao0oTzY3RbV6Jns
Nov 28 17:31:18 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Nov 28 17:31:18 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 28 17:31:18 compute-0 systemd-logind[788]: New session 27 of user nova.
Nov 28 17:31:18 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 28 17:31:18 compute-0 systemd[1]: Starting User Manager for UID 42436...
Nov 28 17:31:18 compute-0 systemd[209012]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 28 17:31:19 compute-0 systemd[209012]: Queued start job for default target Main User Target.
Nov 28 17:31:19 compute-0 systemd[209012]: Created slice User Application Slice.
Nov 28 17:31:19 compute-0 systemd[209012]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 28 17:31:19 compute-0 systemd[209012]: Started Daily Cleanup of User's Temporary Directories.
Nov 28 17:31:19 compute-0 systemd[209012]: Reached target Paths.
Nov 28 17:31:19 compute-0 systemd[209012]: Reached target Timers.
Nov 28 17:31:19 compute-0 systemd[209012]: Starting D-Bus User Message Bus Socket...
Nov 28 17:31:19 compute-0 systemd[209012]: Starting Create User's Volatile Files and Directories...
Nov 28 17:31:19 compute-0 systemd[209012]: Finished Create User's Volatile Files and Directories.
Nov 28 17:31:19 compute-0 systemd[209012]: Listening on D-Bus User Message Bus Socket.
Nov 28 17:31:19 compute-0 systemd[209012]: Reached target Sockets.
Nov 28 17:31:19 compute-0 systemd[209012]: Reached target Basic System.
Nov 28 17:31:19 compute-0 systemd[209012]: Reached target Main User Target.
Nov 28 17:31:19 compute-0 systemd[1]: Started User Manager for UID 42436.
Nov 28 17:31:19 compute-0 systemd[209012]: Startup finished in 150ms.
Nov 28 17:31:19 compute-0 systemd[1]: Started Session 27 of User nova.
Nov 28 17:31:19 compute-0 sshd-session[209008]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 28 17:31:19 compute-0 sshd-session[209029]: Received disconnect from 192.168.122.101 port 33090:11: disconnected by user
Nov 28 17:31:19 compute-0 sshd-session[209029]: Disconnected from user nova 192.168.122.101 port 33090
Nov 28 17:31:19 compute-0 sshd-session[209008]: pam_unix(sshd:session): session closed for user nova
Nov 28 17:31:19 compute-0 systemd[1]: session-27.scope: Deactivated successfully.
Nov 28 17:31:19 compute-0 systemd-logind[788]: Session 27 logged out. Waiting for processes to exit.
Nov 28 17:31:19 compute-0 systemd-logind[788]: Removed session 27.
Nov 28 17:31:19 compute-0 sshd-session[209031]: Accepted publickey for nova from 192.168.122.101 port 33104 ssh2: ECDSA SHA256:MQeyK0000lOAPc/y+l43YtWHgJLzao0oTzY3RbV6Jns
Nov 28 17:31:19 compute-0 systemd-logind[788]: New session 29 of user nova.
Nov 28 17:31:19 compute-0 systemd[1]: Started Session 29 of User nova.
Nov 28 17:31:19 compute-0 sshd-session[209031]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 28 17:31:19 compute-0 sshd-session[209034]: Received disconnect from 192.168.122.101 port 33104:11: disconnected by user
Nov 28 17:31:19 compute-0 sshd-session[209034]: Disconnected from user nova 192.168.122.101 port 33104
Nov 28 17:31:19 compute-0 sshd-session[209031]: pam_unix(sshd:session): session closed for user nova
Nov 28 17:31:19 compute-0 systemd[1]: session-29.scope: Deactivated successfully.
Nov 28 17:31:19 compute-0 systemd-logind[788]: Session 29 logged out. Waiting for processes to exit.
Nov 28 17:31:19 compute-0 systemd-logind[788]: Removed session 29.
Nov 28 17:31:21 compute-0 nova_compute[187223]: 2025-11-28 17:31:21.542 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:31:22 compute-0 nova_compute[187223]: 2025-11-28 17:31:22.202 187227 DEBUG nova.compute.manager [req-bbafeb24-6e96-420d-8883-af4d3dd33ab5 req-792dae67-61be-4465-983f-5a57de10f58c 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 8ce9d497-2a8a-4c42-b93f-5778740cbc9b] Received event network-vif-unplugged-17cde755-7c4b-416f-b8da-a328887f9819 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:31:22 compute-0 nova_compute[187223]: 2025-11-28 17:31:22.203 187227 DEBUG oslo_concurrency.lockutils [req-bbafeb24-6e96-420d-8883-af4d3dd33ab5 req-792dae67-61be-4465-983f-5a57de10f58c 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "8ce9d497-2a8a-4c42-b93f-5778740cbc9b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:31:22 compute-0 nova_compute[187223]: 2025-11-28 17:31:22.204 187227 DEBUG oslo_concurrency.lockutils [req-bbafeb24-6e96-420d-8883-af4d3dd33ab5 req-792dae67-61be-4465-983f-5a57de10f58c 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "8ce9d497-2a8a-4c42-b93f-5778740cbc9b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:31:22 compute-0 nova_compute[187223]: 2025-11-28 17:31:22.204 187227 DEBUG oslo_concurrency.lockutils [req-bbafeb24-6e96-420d-8883-af4d3dd33ab5 req-792dae67-61be-4465-983f-5a57de10f58c 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "8ce9d497-2a8a-4c42-b93f-5778740cbc9b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:31:22 compute-0 nova_compute[187223]: 2025-11-28 17:31:22.204 187227 DEBUG nova.compute.manager [req-bbafeb24-6e96-420d-8883-af4d3dd33ab5 req-792dae67-61be-4465-983f-5a57de10f58c 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 8ce9d497-2a8a-4c42-b93f-5778740cbc9b] No waiting events found dispatching network-vif-unplugged-17cde755-7c4b-416f-b8da-a328887f9819 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:31:22 compute-0 nova_compute[187223]: 2025-11-28 17:31:22.204 187227 WARNING nova.compute.manager [req-bbafeb24-6e96-420d-8883-af4d3dd33ab5 req-792dae67-61be-4465-983f-5a57de10f58c 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 8ce9d497-2a8a-4c42-b93f-5778740cbc9b] Received unexpected event network-vif-unplugged-17cde755-7c4b-416f-b8da-a328887f9819 for instance with vm_state active and task_state resize_migrating.
Nov 28 17:31:22 compute-0 nova_compute[187223]: 2025-11-28 17:31:22.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:31:22 compute-0 nova_compute[187223]: 2025-11-28 17:31:22.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:31:22 compute-0 sshd-session[209036]: Accepted publickey for nova from 192.168.122.101 port 33120 ssh2: ECDSA SHA256:MQeyK0000lOAPc/y+l43YtWHgJLzao0oTzY3RbV6Jns
Nov 28 17:31:22 compute-0 systemd-logind[788]: New session 30 of user nova.
Nov 28 17:31:22 compute-0 systemd[1]: Started Session 30 of User nova.
Nov 28 17:31:22 compute-0 sshd-session[209036]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 28 17:31:22 compute-0 podman[209038]: 2025-11-28 17:31:22.798089556 +0000 UTC m=+0.088101477 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Nov 28 17:31:22 compute-0 podman[209040]: 2025-11-28 17:31:22.833554887 +0000 UTC m=+0.123844996 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3)
Nov 28 17:31:22 compute-0 nova_compute[187223]: 2025-11-28 17:31:22.916 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:31:23 compute-0 sshd-session[209051]: Received disconnect from 192.168.122.101 port 33120:11: disconnected by user
Nov 28 17:31:23 compute-0 sshd-session[209051]: Disconnected from user nova 192.168.122.101 port 33120
Nov 28 17:31:23 compute-0 sshd-session[209036]: pam_unix(sshd:session): session closed for user nova
Nov 28 17:31:23 compute-0 systemd[1]: session-30.scope: Deactivated successfully.
Nov 28 17:31:23 compute-0 systemd-logind[788]: Session 30 logged out. Waiting for processes to exit.
Nov 28 17:31:23 compute-0 systemd-logind[788]: Removed session 30.
Nov 28 17:31:23 compute-0 sshd-session[209087]: Accepted publickey for nova from 192.168.122.101 port 35540 ssh2: ECDSA SHA256:MQeyK0000lOAPc/y+l43YtWHgJLzao0oTzY3RbV6Jns
Nov 28 17:31:23 compute-0 systemd-logind[788]: New session 31 of user nova.
Nov 28 17:31:23 compute-0 systemd[1]: Started Session 31 of User nova.
Nov 28 17:31:23 compute-0 sshd-session[209087]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 28 17:31:23 compute-0 sshd-session[209090]: Received disconnect from 192.168.122.101 port 35540:11: disconnected by user
Nov 28 17:31:23 compute-0 sshd-session[209090]: Disconnected from user nova 192.168.122.101 port 35540
Nov 28 17:31:23 compute-0 sshd-session[209087]: pam_unix(sshd:session): session closed for user nova
Nov 28 17:31:23 compute-0 systemd[1]: session-31.scope: Deactivated successfully.
Nov 28 17:31:23 compute-0 systemd-logind[788]: Session 31 logged out. Waiting for processes to exit.
Nov 28 17:31:23 compute-0 systemd-logind[788]: Removed session 31.
Nov 28 17:31:23 compute-0 sshd-session[209092]: Accepted publickey for nova from 192.168.122.101 port 35556 ssh2: ECDSA SHA256:MQeyK0000lOAPc/y+l43YtWHgJLzao0oTzY3RbV6Jns
Nov 28 17:31:23 compute-0 systemd-logind[788]: New session 32 of user nova.
Nov 28 17:31:23 compute-0 systemd[1]: Started Session 32 of User nova.
Nov 28 17:31:23 compute-0 sshd-session[209092]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 28 17:31:23 compute-0 nova_compute[187223]: 2025-11-28 17:31:23.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:31:23 compute-0 sshd-session[209095]: Received disconnect from 192.168.122.101 port 35556:11: disconnected by user
Nov 28 17:31:23 compute-0 sshd-session[209095]: Disconnected from user nova 192.168.122.101 port 35556
Nov 28 17:31:23 compute-0 sshd-session[209092]: pam_unix(sshd:session): session closed for user nova
Nov 28 17:31:23 compute-0 systemd[1]: session-32.scope: Deactivated successfully.
Nov 28 17:31:23 compute-0 systemd-logind[788]: Session 32 logged out. Waiting for processes to exit.
Nov 28 17:31:23 compute-0 systemd-logind[788]: Removed session 32.
Nov 28 17:31:24 compute-0 nova_compute[187223]: 2025-11-28 17:31:24.356 187227 DEBUG nova.compute.manager [req-bdeae4a6-d083-4441-b358-07cb676089dd req-412e5fed-b1f6-42ed-b486-013e1a528557 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 8ce9d497-2a8a-4c42-b93f-5778740cbc9b] Received event network-vif-plugged-17cde755-7c4b-416f-b8da-a328887f9819 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:31:24 compute-0 nova_compute[187223]: 2025-11-28 17:31:24.358 187227 DEBUG oslo_concurrency.lockutils [req-bdeae4a6-d083-4441-b358-07cb676089dd req-412e5fed-b1f6-42ed-b486-013e1a528557 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "8ce9d497-2a8a-4c42-b93f-5778740cbc9b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:31:24 compute-0 nova_compute[187223]: 2025-11-28 17:31:24.358 187227 DEBUG oslo_concurrency.lockutils [req-bdeae4a6-d083-4441-b358-07cb676089dd req-412e5fed-b1f6-42ed-b486-013e1a528557 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "8ce9d497-2a8a-4c42-b93f-5778740cbc9b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:31:24 compute-0 nova_compute[187223]: 2025-11-28 17:31:24.358 187227 DEBUG oslo_concurrency.lockutils [req-bdeae4a6-d083-4441-b358-07cb676089dd req-412e5fed-b1f6-42ed-b486-013e1a528557 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "8ce9d497-2a8a-4c42-b93f-5778740cbc9b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:31:24 compute-0 nova_compute[187223]: 2025-11-28 17:31:24.359 187227 DEBUG nova.compute.manager [req-bdeae4a6-d083-4441-b358-07cb676089dd req-412e5fed-b1f6-42ed-b486-013e1a528557 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 8ce9d497-2a8a-4c42-b93f-5778740cbc9b] No waiting events found dispatching network-vif-plugged-17cde755-7c4b-416f-b8da-a328887f9819 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:31:24 compute-0 nova_compute[187223]: 2025-11-28 17:31:24.359 187227 WARNING nova.compute.manager [req-bdeae4a6-d083-4441-b358-07cb676089dd req-412e5fed-b1f6-42ed-b486-013e1a528557 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 8ce9d497-2a8a-4c42-b93f-5778740cbc9b] Received unexpected event network-vif-plugged-17cde755-7c4b-416f-b8da-a328887f9819 for instance with vm_state active and task_state resize_migrated.
Nov 28 17:31:24 compute-0 nova_compute[187223]: 2025-11-28 17:31:24.380 187227 INFO nova.network.neutron [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 8ce9d497-2a8a-4c42-b93f-5778740cbc9b] Updating port 17cde755-7c4b-416f-b8da-a328887f9819 with attributes {'binding:host_id': 'compute-0.ctlplane.example.com', 'device_owner': 'compute:nova'}
Nov 28 17:31:24 compute-0 nova_compute[187223]: 2025-11-28 17:31:24.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:31:25 compute-0 podman[209097]: 2025-11-28 17:31:25.24669861 +0000 UTC m=+0.093639399 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, vcs-type=git, version=9.6, managed_by=edpm_ansible, release=1755695350, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, name=ubi9-minimal, summary=Provides the 
latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 28 17:31:25 compute-0 nova_compute[187223]: 2025-11-28 17:31:25.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:31:25 compute-0 nova_compute[187223]: 2025-11-28 17:31:25.684 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 17:31:25 compute-0 nova_compute[187223]: 2025-11-28 17:31:25.685 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:31:25 compute-0 nova_compute[187223]: 2025-11-28 17:31:25.708 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:31:25 compute-0 nova_compute[187223]: 2025-11-28 17:31:25.709 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:31:25 compute-0 nova_compute[187223]: 2025-11-28 17:31:25.709 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:31:25 compute-0 nova_compute[187223]: 2025-11-28 17:31:25.709 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 17:31:25 compute-0 nova_compute[187223]: 2025-11-28 17:31:25.809 187227 DEBUG oslo_concurrency.lockutils [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "refresh_cache-8ce9d497-2a8a-4c42-b93f-5778740cbc9b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 17:31:25 compute-0 nova_compute[187223]: 2025-11-28 17:31:25.810 187227 DEBUG oslo_concurrency.lockutils [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquired lock "refresh_cache-8ce9d497-2a8a-4c42-b93f-5778740cbc9b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 17:31:25 compute-0 nova_compute[187223]: 2025-11-28 17:31:25.811 187227 DEBUG nova.network.neutron [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 8ce9d497-2a8a-4c42-b93f-5778740cbc9b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 28 17:31:25 compute-0 nova_compute[187223]: 2025-11-28 17:31:25.820 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ea967bd2-166d-4969-ad81-03f2528ed4f5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:31:25 compute-0 nova_compute[187223]: 2025-11-28 17:31:25.886 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ea967bd2-166d-4969-ad81-03f2528ed4f5/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:31:25 compute-0 nova_compute[187223]: 2025-11-28 17:31:25.887 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ea967bd2-166d-4969-ad81-03f2528ed4f5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:31:25 compute-0 nova_compute[187223]: 2025-11-28 17:31:25.923 187227 DEBUG nova.compute.manager [req-0130ec1c-e4a4-4410-afd4-bc7f01934870 req-c6e8c8c1-0850-4700-a3af-24f7320f57f9 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 8ce9d497-2a8a-4c42-b93f-5778740cbc9b] Received event network-changed-17cde755-7c4b-416f-b8da-a328887f9819 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:31:25 compute-0 nova_compute[187223]: 2025-11-28 17:31:25.924 187227 DEBUG nova.compute.manager [req-0130ec1c-e4a4-4410-afd4-bc7f01934870 req-c6e8c8c1-0850-4700-a3af-24f7320f57f9 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 8ce9d497-2a8a-4c42-b93f-5778740cbc9b] Refreshing instance network info cache due to event network-changed-17cde755-7c4b-416f-b8da-a328887f9819. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 28 17:31:25 compute-0 nova_compute[187223]: 2025-11-28 17:31:25.924 187227 DEBUG oslo_concurrency.lockutils [req-0130ec1c-e4a4-4410-afd4-bc7f01934870 req-c6e8c8c1-0850-4700-a3af-24f7320f57f9 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "refresh_cache-8ce9d497-2a8a-4c42-b93f-5778740cbc9b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 17:31:25 compute-0 nova_compute[187223]: 2025-11-28 17:31:25.960 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ea967bd2-166d-4969-ad81-03f2528ed4f5/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:31:26 compute-0 nova_compute[187223]: 2025-11-28 17:31:26.117 187227 WARNING nova.virt.libvirt.driver [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 17:31:26 compute-0 nova_compute[187223]: 2025-11-28 17:31:26.118 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5661MB free_disk=73.28742599487305GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 17:31:26 compute-0 nova_compute[187223]: 2025-11-28 17:31:26.119 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:31:26 compute-0 nova_compute[187223]: 2025-11-28 17:31:26.119 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:31:26 compute-0 nova_compute[187223]: 2025-11-28 17:31:26.190 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Applying migration context for instance 8ce9d497-2a8a-4c42-b93f-5778740cbc9b as it has an incoming, in-progress migration 62a97eae-dd4a-4d2a-bd90-dac4f9f1baec. Migration status is post-migrating _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:950
Nov 28 17:31:26 compute-0 nova_compute[187223]: 2025-11-28 17:31:26.191 187227 INFO nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] [instance: 8ce9d497-2a8a-4c42-b93f-5778740cbc9b] Updating resource usage from migration 62a97eae-dd4a-4d2a-bd90-dac4f9f1baec
Nov 28 17:31:26 compute-0 nova_compute[187223]: 2025-11-28 17:31:26.226 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Instance ea967bd2-166d-4969-ad81-03f2528ed4f5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 17:31:26 compute-0 nova_compute[187223]: 2025-11-28 17:31:26.227 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Instance 8ce9d497-2a8a-4c42-b93f-5778740cbc9b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 17:31:26 compute-0 nova_compute[187223]: 2025-11-28 17:31:26.227 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 17:31:26 compute-0 nova_compute[187223]: 2025-11-28 17:31:26.227 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 17:31:26 compute-0 nova_compute[187223]: 2025-11-28 17:31:26.291 187227 DEBUG nova.compute.provider_tree [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 17:31:26 compute-0 nova_compute[187223]: 2025-11-28 17:31:26.308 187227 DEBUG nova.scheduler.client.report [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 17:31:26 compute-0 nova_compute[187223]: 2025-11-28 17:31:26.330 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 17:31:26 compute-0 nova_compute[187223]: 2025-11-28 17:31:26.331 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.212s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:31:26 compute-0 nova_compute[187223]: 2025-11-28 17:31:26.548 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:31:27 compute-0 nova_compute[187223]: 2025-11-28 17:31:27.331 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:31:27 compute-0 nova_compute[187223]: 2025-11-28 17:31:27.525 187227 DEBUG nova.network.neutron [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 8ce9d497-2a8a-4c42-b93f-5778740cbc9b] Updating instance_info_cache with network_info: [{"id": "17cde755-7c4b-416f-b8da-a328887f9819", "address": "fa:16:3e:4e:32:1a", "network": {"id": "015f34bb-5da1-42eb-bab2-066f32a46dd5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-213410289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ea47683b97094cc99b882a5a1b90949f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap17cde755-7c", "ovs_interfaceid": "17cde755-7c4b-416f-b8da-a328887f9819", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:31:27 compute-0 nova_compute[187223]: 2025-11-28 17:31:27.571 187227 DEBUG oslo_concurrency.lockutils [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Releasing lock "refresh_cache-8ce9d497-2a8a-4c42-b93f-5778740cbc9b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 17:31:27 compute-0 nova_compute[187223]: 2025-11-28 17:31:27.577 187227 DEBUG oslo_concurrency.lockutils [req-0130ec1c-e4a4-4410-afd4-bc7f01934870 req-c6e8c8c1-0850-4700-a3af-24f7320f57f9 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquired lock "refresh_cache-8ce9d497-2a8a-4c42-b93f-5778740cbc9b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 17:31:27 compute-0 nova_compute[187223]: 2025-11-28 17:31:27.577 187227 DEBUG nova.network.neutron [req-0130ec1c-e4a4-4410-afd4-bc7f01934870 req-c6e8c8c1-0850-4700-a3af-24f7320f57f9 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 8ce9d497-2a8a-4c42-b93f-5778740cbc9b] Refreshing network info cache for port 17cde755-7c4b-416f-b8da-a328887f9819 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 28 17:31:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:27.676 104433 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:31:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:27.677 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:31:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:27.677 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:31:27 compute-0 nova_compute[187223]: 2025-11-28 17:31:27.679 187227 DEBUG nova.virt.libvirt.driver [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 8ce9d497-2a8a-4c42-b93f-5778740cbc9b] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698
Nov 28 17:31:27 compute-0 nova_compute[187223]: 2025-11-28 17:31:27.682 187227 DEBUG nova.virt.libvirt.driver [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 8ce9d497-2a8a-4c42-b93f-5778740cbc9b] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Nov 28 17:31:27 compute-0 nova_compute[187223]: 2025-11-28 17:31:27.682 187227 INFO nova.virt.libvirt.driver [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 8ce9d497-2a8a-4c42-b93f-5778740cbc9b] Creating image(s)
Nov 28 17:31:27 compute-0 nova_compute[187223]: 2025-11-28 17:31:27.683 187227 DEBUG nova.objects.instance [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 8ce9d497-2a8a-4c42-b93f-5778740cbc9b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:31:27 compute-0 nova_compute[187223]: 2025-11-28 17:31:27.728 187227 DEBUG oslo_concurrency.processutils [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:31:27 compute-0 nova_compute[187223]: 2025-11-28 17:31:27.821 187227 DEBUG oslo_concurrency.processutils [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:31:27 compute-0 nova_compute[187223]: 2025-11-28 17:31:27.822 187227 DEBUG nova.virt.disk.api [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Checking if we can resize image /var/lib/nova/instances/8ce9d497-2a8a-4c42-b93f-5778740cbc9b/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 28 17:31:27 compute-0 nova_compute[187223]: 2025-11-28 17:31:27.822 187227 DEBUG oslo_concurrency.processutils [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8ce9d497-2a8a-4c42-b93f-5778740cbc9b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:31:27 compute-0 nova_compute[187223]: 2025-11-28 17:31:27.891 187227 DEBUG oslo_concurrency.processutils [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8ce9d497-2a8a-4c42-b93f-5778740cbc9b/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:31:27 compute-0 nova_compute[187223]: 2025-11-28 17:31:27.892 187227 DEBUG nova.virt.disk.api [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Cannot resize image /var/lib/nova/instances/8ce9d497-2a8a-4c42-b93f-5778740cbc9b/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 28 17:31:27 compute-0 nova_compute[187223]: 2025-11-28 17:31:27.905 187227 DEBUG nova.virt.libvirt.driver [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 8ce9d497-2a8a-4c42-b93f-5778740cbc9b] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859
Nov 28 17:31:27 compute-0 nova_compute[187223]: 2025-11-28 17:31:27.905 187227 DEBUG nova.virt.libvirt.driver [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 8ce9d497-2a8a-4c42-b93f-5778740cbc9b] Ensure instance console log exists: /var/lib/nova/instances/8ce9d497-2a8a-4c42-b93f-5778740cbc9b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 28 17:31:27 compute-0 nova_compute[187223]: 2025-11-28 17:31:27.906 187227 DEBUG oslo_concurrency.lockutils [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:31:27 compute-0 nova_compute[187223]: 2025-11-28 17:31:27.906 187227 DEBUG oslo_concurrency.lockutils [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:31:27 compute-0 nova_compute[187223]: 2025-11-28 17:31:27.906 187227 DEBUG oslo_concurrency.lockutils [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:31:27 compute-0 nova_compute[187223]: 2025-11-28 17:31:27.909 187227 DEBUG nova.virt.libvirt.driver [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 8ce9d497-2a8a-4c42-b93f-5778740cbc9b] Start _get_guest_xml network_info=[{"id": "17cde755-7c4b-416f-b8da-a328887f9819", "address": "fa:16:3e:4e:32:1a", "network": {"id": "015f34bb-5da1-42eb-bab2-066f32a46dd5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-213410289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-213410289-network", "vif_mac": "fa:16:3e:4e:32:1a"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ea47683b97094cc99b882a5a1b90949f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap17cde755-7c", "ovs_interfaceid": "17cde755-7c4b-416f-b8da-a328887f9819", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T17:27:07Z,direct_url=<?>,disk_format='qcow2',id=e66bcfff-a835-4b6a-9892-490d158c356a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ca3b936304c947f98c618f4bf25dee0e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-28T17:27:08Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_format': None, 'device_name': '/dev/vda', 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e66bcfff-a835-4b6a-9892-490d158c356a'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 28 17:31:27 compute-0 nova_compute[187223]: 2025-11-28 17:31:27.914 187227 WARNING nova.virt.libvirt.driver [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 17:31:27 compute-0 nova_compute[187223]: 2025-11-28 17:31:27.929 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:31:27 compute-0 nova_compute[187223]: 2025-11-28 17:31:27.932 187227 DEBUG nova.virt.libvirt.host [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 28 17:31:27 compute-0 nova_compute[187223]: 2025-11-28 17:31:27.932 187227 DEBUG nova.virt.libvirt.host [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 28 17:31:27 compute-0 nova_compute[187223]: 2025-11-28 17:31:27.938 187227 DEBUG nova.virt.libvirt.host [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 28 17:31:27 compute-0 nova_compute[187223]: 2025-11-28 17:31:27.939 187227 DEBUG nova.virt.libvirt.host [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 28 17:31:27 compute-0 nova_compute[187223]: 2025-11-28 17:31:27.940 187227 DEBUG nova.virt.libvirt.driver [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 28 17:31:27 compute-0 nova_compute[187223]: 2025-11-28 17:31:27.941 187227 DEBUG nova.virt.hardware [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-28T17:27:05Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='0f80ec62-dee8-4253-8ca9-0848bcaf92f4',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T17:27:07Z,direct_url=<?>,disk_format='qcow2',id=e66bcfff-a835-4b6a-9892-490d158c356a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ca3b936304c947f98c618f4bf25dee0e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-28T17:27:08Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 28 17:31:27 compute-0 nova_compute[187223]: 2025-11-28 17:31:27.941 187227 DEBUG nova.virt.hardware [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 28 17:31:27 compute-0 nova_compute[187223]: 2025-11-28 17:31:27.941 187227 DEBUG nova.virt.hardware [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 28 17:31:27 compute-0 nova_compute[187223]: 2025-11-28 17:31:27.942 187227 DEBUG nova.virt.hardware [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 28 17:31:27 compute-0 nova_compute[187223]: 2025-11-28 17:31:27.942 187227 DEBUG nova.virt.hardware [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 28 17:31:27 compute-0 nova_compute[187223]: 2025-11-28 17:31:27.942 187227 DEBUG nova.virt.hardware [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 28 17:31:27 compute-0 nova_compute[187223]: 2025-11-28 17:31:27.943 187227 DEBUG nova.virt.hardware [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 28 17:31:27 compute-0 nova_compute[187223]: 2025-11-28 17:31:27.943 187227 DEBUG nova.virt.hardware [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 28 17:31:27 compute-0 nova_compute[187223]: 2025-11-28 17:31:27.943 187227 DEBUG nova.virt.hardware [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 28 17:31:27 compute-0 nova_compute[187223]: 2025-11-28 17:31:27.943 187227 DEBUG nova.virt.hardware [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 28 17:31:27 compute-0 nova_compute[187223]: 2025-11-28 17:31:27.944 187227 DEBUG nova.virt.hardware [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 28 17:31:27 compute-0 nova_compute[187223]: 2025-11-28 17:31:27.944 187227 DEBUG nova.objects.instance [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 8ce9d497-2a8a-4c42-b93f-5778740cbc9b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:31:27 compute-0 nova_compute[187223]: 2025-11-28 17:31:27.962 187227 DEBUG oslo_concurrency.processutils [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8ce9d497-2a8a-4c42-b93f-5778740cbc9b/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:31:28 compute-0 nova_compute[187223]: 2025-11-28 17:31:28.023 187227 DEBUG oslo_concurrency.processutils [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8ce9d497-2a8a-4c42-b93f-5778740cbc9b/disk.config --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:31:28 compute-0 nova_compute[187223]: 2025-11-28 17:31:28.025 187227 DEBUG oslo_concurrency.lockutils [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "/var/lib/nova/instances/8ce9d497-2a8a-4c42-b93f-5778740cbc9b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:31:28 compute-0 nova_compute[187223]: 2025-11-28 17:31:28.025 187227 DEBUG oslo_concurrency.lockutils [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "/var/lib/nova/instances/8ce9d497-2a8a-4c42-b93f-5778740cbc9b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:31:28 compute-0 nova_compute[187223]: 2025-11-28 17:31:28.027 187227 DEBUG oslo_concurrency.lockutils [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "/var/lib/nova/instances/8ce9d497-2a8a-4c42-b93f-5778740cbc9b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:31:28 compute-0 nova_compute[187223]: 2025-11-28 17:31:28.029 187227 DEBUG nova.virt.libvirt.vif [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T17:30:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-415079104',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-415079104',id=1,image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-28T17:30:46Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='ea47683b97094cc99b882a5a1b90949f',ramdisk_id='',reservation_id='r-y95xlee7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-544878882',owner_user_name='tempest-TestExecuteActionsViaActuator-544878882-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T17:31:24Z,user_data=None,user_id='f7ca965410e74fcabced6e50aab5d096',uuid=8ce9d497-2a8a-4c42-b93f-5778740cbc9b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "17cde755-7c4b-416f-b8da-a328887f9819", "address": "fa:16:3e:4e:32:1a", "network": {"id": "015f34bb-5da1-42eb-bab2-066f32a46dd5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-213410289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-213410289-network", "vif_mac": "fa:16:3e:4e:32:1a"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ea47683b97094cc99b882a5a1b90949f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap17cde755-7c", "ovs_interfaceid": "17cde755-7c4b-416f-b8da-a328887f9819", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 28 17:31:28 compute-0 nova_compute[187223]: 2025-11-28 17:31:28.029 187227 DEBUG nova.network.os_vif_util [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Converting VIF {"id": "17cde755-7c4b-416f-b8da-a328887f9819", "address": "fa:16:3e:4e:32:1a", "network": {"id": "015f34bb-5da1-42eb-bab2-066f32a46dd5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-213410289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-213410289-network", "vif_mac": "fa:16:3e:4e:32:1a"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ea47683b97094cc99b882a5a1b90949f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap17cde755-7c", "ovs_interfaceid": "17cde755-7c4b-416f-b8da-a328887f9819", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 17:31:28 compute-0 nova_compute[187223]: 2025-11-28 17:31:28.031 187227 DEBUG nova.network.os_vif_util [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4e:32:1a,bridge_name='br-int',has_traffic_filtering=True,id=17cde755-7c4b-416f-b8da-a328887f9819,network=Network(015f34bb-5da1-42eb-bab2-066f32a46dd5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap17cde755-7c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 17:31:28 compute-0 nova_compute[187223]: 2025-11-28 17:31:28.035 187227 DEBUG nova.virt.libvirt.driver [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 8ce9d497-2a8a-4c42-b93f-5778740cbc9b] End _get_guest_xml xml=<domain type="kvm">
Nov 28 17:31:28 compute-0 nova_compute[187223]:   <uuid>8ce9d497-2a8a-4c42-b93f-5778740cbc9b</uuid>
Nov 28 17:31:28 compute-0 nova_compute[187223]:   <name>instance-00000001</name>
Nov 28 17:31:28 compute-0 nova_compute[187223]:   <memory>196608</memory>
Nov 28 17:31:28 compute-0 nova_compute[187223]:   <vcpu>1</vcpu>
Nov 28 17:31:28 compute-0 nova_compute[187223]:   <metadata>
Nov 28 17:31:28 compute-0 nova_compute[187223]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 28 17:31:28 compute-0 nova_compute[187223]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 28 17:31:28 compute-0 nova_compute[187223]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-415079104</nova:name>
Nov 28 17:31:28 compute-0 nova_compute[187223]:       <nova:creationTime>2025-11-28 17:31:27</nova:creationTime>
Nov 28 17:31:28 compute-0 nova_compute[187223]:       <nova:flavor name="m1.micro">
Nov 28 17:31:28 compute-0 nova_compute[187223]:         <nova:memory>192</nova:memory>
Nov 28 17:31:28 compute-0 nova_compute[187223]:         <nova:disk>1</nova:disk>
Nov 28 17:31:28 compute-0 nova_compute[187223]:         <nova:swap>0</nova:swap>
Nov 28 17:31:28 compute-0 nova_compute[187223]:         <nova:ephemeral>0</nova:ephemeral>
Nov 28 17:31:28 compute-0 nova_compute[187223]:         <nova:vcpus>1</nova:vcpus>
Nov 28 17:31:28 compute-0 nova_compute[187223]:       </nova:flavor>
Nov 28 17:31:28 compute-0 nova_compute[187223]:       <nova:owner>
Nov 28 17:31:28 compute-0 nova_compute[187223]:         <nova:user uuid="f7ca965410e74fcabced6e50aab5d096">tempest-TestExecuteActionsViaActuator-544878882-project-member</nova:user>
Nov 28 17:31:28 compute-0 nova_compute[187223]:         <nova:project uuid="ea47683b97094cc99b882a5a1b90949f">tempest-TestExecuteActionsViaActuator-544878882</nova:project>
Nov 28 17:31:28 compute-0 nova_compute[187223]:       </nova:owner>
Nov 28 17:31:28 compute-0 nova_compute[187223]:       <nova:root type="image" uuid="e66bcfff-a835-4b6a-9892-490d158c356a"/>
Nov 28 17:31:28 compute-0 nova_compute[187223]:       <nova:ports>
Nov 28 17:31:28 compute-0 nova_compute[187223]:         <nova:port uuid="17cde755-7c4b-416f-b8da-a328887f9819">
Nov 28 17:31:28 compute-0 nova_compute[187223]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 28 17:31:28 compute-0 nova_compute[187223]:         </nova:port>
Nov 28 17:31:28 compute-0 nova_compute[187223]:       </nova:ports>
Nov 28 17:31:28 compute-0 nova_compute[187223]:     </nova:instance>
Nov 28 17:31:28 compute-0 nova_compute[187223]:   </metadata>
Nov 28 17:31:28 compute-0 nova_compute[187223]:   <sysinfo type="smbios">
Nov 28 17:31:28 compute-0 nova_compute[187223]:     <system>
Nov 28 17:31:28 compute-0 nova_compute[187223]:       <entry name="manufacturer">RDO</entry>
Nov 28 17:31:28 compute-0 nova_compute[187223]:       <entry name="product">OpenStack Compute</entry>
Nov 28 17:31:28 compute-0 nova_compute[187223]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 28 17:31:28 compute-0 nova_compute[187223]:       <entry name="serial">8ce9d497-2a8a-4c42-b93f-5778740cbc9b</entry>
Nov 28 17:31:28 compute-0 nova_compute[187223]:       <entry name="uuid">8ce9d497-2a8a-4c42-b93f-5778740cbc9b</entry>
Nov 28 17:31:28 compute-0 nova_compute[187223]:       <entry name="family">Virtual Machine</entry>
Nov 28 17:31:28 compute-0 nova_compute[187223]:     </system>
Nov 28 17:31:28 compute-0 nova_compute[187223]:   </sysinfo>
Nov 28 17:31:28 compute-0 nova_compute[187223]:   <os>
Nov 28 17:31:28 compute-0 nova_compute[187223]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 28 17:31:28 compute-0 nova_compute[187223]:     <boot dev="hd"/>
Nov 28 17:31:28 compute-0 nova_compute[187223]:     <smbios mode="sysinfo"/>
Nov 28 17:31:28 compute-0 nova_compute[187223]:   </os>
Nov 28 17:31:28 compute-0 nova_compute[187223]:   <features>
Nov 28 17:31:28 compute-0 nova_compute[187223]:     <acpi/>
Nov 28 17:31:28 compute-0 nova_compute[187223]:     <apic/>
Nov 28 17:31:28 compute-0 nova_compute[187223]:     <vmcoreinfo/>
Nov 28 17:31:28 compute-0 nova_compute[187223]:   </features>
Nov 28 17:31:28 compute-0 nova_compute[187223]:   <clock offset="utc">
Nov 28 17:31:28 compute-0 nova_compute[187223]:     <timer name="pit" tickpolicy="delay"/>
Nov 28 17:31:28 compute-0 nova_compute[187223]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 28 17:31:28 compute-0 nova_compute[187223]:     <timer name="hpet" present="no"/>
Nov 28 17:31:28 compute-0 nova_compute[187223]:   </clock>
Nov 28 17:31:28 compute-0 nova_compute[187223]:   <cpu mode="custom" match="exact">
Nov 28 17:31:28 compute-0 nova_compute[187223]:     <model>Nehalem</model>
Nov 28 17:31:28 compute-0 nova_compute[187223]:     <topology sockets="1" cores="1" threads="1"/>
Nov 28 17:31:28 compute-0 nova_compute[187223]:   </cpu>
Nov 28 17:31:28 compute-0 nova_compute[187223]:   <devices>
Nov 28 17:31:28 compute-0 nova_compute[187223]:     <disk type="file" device="disk">
Nov 28 17:31:28 compute-0 nova_compute[187223]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 28 17:31:28 compute-0 nova_compute[187223]:       <source file="/var/lib/nova/instances/8ce9d497-2a8a-4c42-b93f-5778740cbc9b/disk"/>
Nov 28 17:31:28 compute-0 nova_compute[187223]:       <target dev="vda" bus="virtio"/>
Nov 28 17:31:28 compute-0 nova_compute[187223]:     </disk>
Nov 28 17:31:28 compute-0 nova_compute[187223]:     <disk type="file" device="cdrom">
Nov 28 17:31:28 compute-0 nova_compute[187223]:       <driver name="qemu" type="raw" cache="none"/>
Nov 28 17:31:28 compute-0 nova_compute[187223]:       <source file="/var/lib/nova/instances/8ce9d497-2a8a-4c42-b93f-5778740cbc9b/disk.config"/>
Nov 28 17:31:28 compute-0 nova_compute[187223]:       <target dev="sda" bus="sata"/>
Nov 28 17:31:28 compute-0 nova_compute[187223]:     </disk>
Nov 28 17:31:28 compute-0 nova_compute[187223]:     <interface type="ethernet">
Nov 28 17:31:28 compute-0 nova_compute[187223]:       <mac address="fa:16:3e:4e:32:1a"/>
Nov 28 17:31:28 compute-0 nova_compute[187223]:       <model type="virtio"/>
Nov 28 17:31:28 compute-0 nova_compute[187223]:       <driver name="vhost" rx_queue_size="512"/>
Nov 28 17:31:28 compute-0 nova_compute[187223]:       <mtu size="1442"/>
Nov 28 17:31:28 compute-0 nova_compute[187223]:       <target dev="tap17cde755-7c"/>
Nov 28 17:31:28 compute-0 nova_compute[187223]:     </interface>
Nov 28 17:31:28 compute-0 nova_compute[187223]:     <serial type="pty">
Nov 28 17:31:28 compute-0 nova_compute[187223]:       <log file="/var/lib/nova/instances/8ce9d497-2a8a-4c42-b93f-5778740cbc9b/console.log" append="off"/>
Nov 28 17:31:28 compute-0 nova_compute[187223]:     </serial>
Nov 28 17:31:28 compute-0 nova_compute[187223]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 28 17:31:28 compute-0 nova_compute[187223]:     <video>
Nov 28 17:31:28 compute-0 nova_compute[187223]:       <model type="virtio"/>
Nov 28 17:31:28 compute-0 nova_compute[187223]:     </video>
Nov 28 17:31:28 compute-0 nova_compute[187223]:     <input type="tablet" bus="usb"/>
Nov 28 17:31:28 compute-0 nova_compute[187223]:     <rng model="virtio">
Nov 28 17:31:28 compute-0 nova_compute[187223]:       <backend model="random">/dev/urandom</backend>
Nov 28 17:31:28 compute-0 nova_compute[187223]:     </rng>
Nov 28 17:31:28 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root"/>
Nov 28 17:31:28 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:28 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:28 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:28 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:28 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:28 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:28 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:28 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:28 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:28 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:28 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:28 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:28 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:28 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:28 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:28 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:28 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:28 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:28 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:28 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:28 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:28 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:28 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:28 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:28 compute-0 nova_compute[187223]:     <controller type="usb" index="0"/>
Nov 28 17:31:28 compute-0 nova_compute[187223]:     <memballoon model="virtio">
Nov 28 17:31:28 compute-0 nova_compute[187223]:       <stats period="10"/>
Nov 28 17:31:28 compute-0 nova_compute[187223]:     </memballoon>
Nov 28 17:31:28 compute-0 nova_compute[187223]:   </devices>
Nov 28 17:31:28 compute-0 nova_compute[187223]: </domain>
Nov 28 17:31:28 compute-0 nova_compute[187223]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 28 17:31:28 compute-0 nova_compute[187223]: 2025-11-28 17:31:28.038 187227 DEBUG nova.virt.libvirt.vif [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T17:30:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-415079104',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-415079104',id=1,image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-28T17:30:46Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='ea47683b97094cc99b882a5a1b90949f',ramdisk_id='',reservation_id='r-y95xlee7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-544878882',owner_user_name='tempest-TestExecuteActionsViaActuator-544878882-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T17:31:24Z,user_data=None,user_id='f7ca965410e74fcabced6e50aab5d096',uuid=8ce9d497-2a8a-4c42-b93f-5778740cbc9b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "17cde755-7c4b-416f-b8da-a328887f9819", "address": "fa:16:3e:4e:32:1a", "network": {"id": "015f34bb-5da1-42eb-bab2-066f32a46dd5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-213410289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-213410289-network", "vif_mac": "fa:16:3e:4e:32:1a"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ea47683b97094cc99b882a5a1b90949f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap17cde755-7c", "ovs_interfaceid": "17cde755-7c4b-416f-b8da-a328887f9819", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 28 17:31:28 compute-0 nova_compute[187223]: 2025-11-28 17:31:28.039 187227 DEBUG nova.network.os_vif_util [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Converting VIF {"id": "17cde755-7c4b-416f-b8da-a328887f9819", "address": "fa:16:3e:4e:32:1a", "network": {"id": "015f34bb-5da1-42eb-bab2-066f32a46dd5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-213410289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-213410289-network", "vif_mac": "fa:16:3e:4e:32:1a"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ea47683b97094cc99b882a5a1b90949f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap17cde755-7c", "ovs_interfaceid": "17cde755-7c4b-416f-b8da-a328887f9819", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 17:31:28 compute-0 nova_compute[187223]: 2025-11-28 17:31:28.040 187227 DEBUG nova.network.os_vif_util [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4e:32:1a,bridge_name='br-int',has_traffic_filtering=True,id=17cde755-7c4b-416f-b8da-a328887f9819,network=Network(015f34bb-5da1-42eb-bab2-066f32a46dd5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap17cde755-7c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 17:31:28 compute-0 nova_compute[187223]: 2025-11-28 17:31:28.041 187227 DEBUG os_vif [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4e:32:1a,bridge_name='br-int',has_traffic_filtering=True,id=17cde755-7c4b-416f-b8da-a328887f9819,network=Network(015f34bb-5da1-42eb-bab2-066f32a46dd5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap17cde755-7c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 28 17:31:28 compute-0 nova_compute[187223]: 2025-11-28 17:31:28.043 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:31:28 compute-0 nova_compute[187223]: 2025-11-28 17:31:28.044 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:31:28 compute-0 nova_compute[187223]: 2025-11-28 17:31:28.045 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 17:31:28 compute-0 nova_compute[187223]: 2025-11-28 17:31:28.050 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:31:28 compute-0 nova_compute[187223]: 2025-11-28 17:31:28.050 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap17cde755-7c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:31:28 compute-0 nova_compute[187223]: 2025-11-28 17:31:28.051 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap17cde755-7c, col_values=(('external_ids', {'iface-id': '17cde755-7c4b-416f-b8da-a328887f9819', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4e:32:1a', 'vm-uuid': '8ce9d497-2a8a-4c42-b93f-5778740cbc9b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:31:28 compute-0 nova_compute[187223]: 2025-11-28 17:31:28.054 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:31:28 compute-0 NetworkManager[55763]: <info>  [1764351088.0575] manager: (tap17cde755-7c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/24)
Nov 28 17:31:28 compute-0 nova_compute[187223]: 2025-11-28 17:31:28.059 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 17:31:28 compute-0 nova_compute[187223]: 2025-11-28 17:31:28.065 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:31:28 compute-0 nova_compute[187223]: 2025-11-28 17:31:28.067 187227 INFO os_vif [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4e:32:1a,bridge_name='br-int',has_traffic_filtering=True,id=17cde755-7c4b-416f-b8da-a328887f9819,network=Network(015f34bb-5da1-42eb-bab2-066f32a46dd5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap17cde755-7c')
Nov 28 17:31:28 compute-0 nova_compute[187223]: 2025-11-28 17:31:28.287 187227 DEBUG nova.virt.libvirt.driver [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 28 17:31:28 compute-0 nova_compute[187223]: 2025-11-28 17:31:28.287 187227 DEBUG nova.virt.libvirt.driver [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 28 17:31:28 compute-0 nova_compute[187223]: 2025-11-28 17:31:28.289 187227 DEBUG nova.virt.libvirt.driver [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] No VIF found with MAC fa:16:3e:4e:32:1a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 28 17:31:28 compute-0 nova_compute[187223]: 2025-11-28 17:31:28.290 187227 INFO nova.virt.libvirt.driver [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 8ce9d497-2a8a-4c42-b93f-5778740cbc9b] Using config drive
Nov 28 17:31:28 compute-0 kernel: tap17cde755-7c: entered promiscuous mode
Nov 28 17:31:28 compute-0 NetworkManager[55763]: <info>  [1764351088.3743] manager: (tap17cde755-7c): new Tun device (/org/freedesktop/NetworkManager/Devices/25)
Nov 28 17:31:28 compute-0 nova_compute[187223]: 2025-11-28 17:31:28.376 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:31:28 compute-0 ovn_controller[95574]: 2025-11-28T17:31:28Z|00032|binding|INFO|Claiming lport 17cde755-7c4b-416f-b8da-a328887f9819 for this chassis.
Nov 28 17:31:28 compute-0 ovn_controller[95574]: 2025-11-28T17:31:28Z|00033|binding|INFO|17cde755-7c4b-416f-b8da-a328887f9819: Claiming fa:16:3e:4e:32:1a 10.100.0.8
Nov 28 17:31:28 compute-0 nova_compute[187223]: 2025-11-28 17:31:28.390 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:31:28 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:28.390 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4e:32:1a 10.100.0.8'], port_security=['fa:16:3e:4e:32:1a 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '8ce9d497-2a8a-4c42-b93f-5778740cbc9b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-015f34bb-5da1-42eb-bab2-066f32a46dd5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ea47683b97094cc99b882a5a1b90949f', 'neutron:revision_number': '6', 'neutron:security_group_ids': '82cdfce6-8f2d-44f3-bd0a-80dabea6bfd5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f4d8a34e-2c92-41ae-a2d1-bdb3f1fafb55, chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], logical_port=17cde755-7c4b-416f-b8da-a328887f9819) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 17:31:28 compute-0 ovn_controller[95574]: 2025-11-28T17:31:28Z|00034|binding|INFO|Setting lport 17cde755-7c4b-416f-b8da-a328887f9819 up in Southbound
Nov 28 17:31:28 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:28.392 104433 INFO neutron.agent.ovn.metadata.agent [-] Port 17cde755-7c4b-416f-b8da-a328887f9819 in datapath 015f34bb-5da1-42eb-bab2-066f32a46dd5 bound to our chassis
Nov 28 17:31:28 compute-0 nova_compute[187223]: 2025-11-28 17:31:28.392 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:31:28 compute-0 ovn_controller[95574]: 2025-11-28T17:31:28Z|00035|binding|INFO|Setting lport 17cde755-7c4b-416f-b8da-a328887f9819 ovn-installed in OVS
Nov 28 17:31:28 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:28.394 104433 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 015f34bb-5da1-42eb-bab2-066f32a46dd5
Nov 28 17:31:28 compute-0 nova_compute[187223]: 2025-11-28 17:31:28.395 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:31:28 compute-0 nova_compute[187223]: 2025-11-28 17:31:28.400 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:31:28 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:28.414 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[af9e48cc-142f-4486-9e62-23a5c8373104]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:31:28 compute-0 systemd-udevd[209150]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 17:31:28 compute-0 systemd-machined[153517]: New machine qemu-2-instance-00000001.
Nov 28 17:31:28 compute-0 NetworkManager[55763]: <info>  [1764351088.4357] device (tap17cde755-7c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 28 17:31:28 compute-0 NetworkManager[55763]: <info>  [1764351088.4368] device (tap17cde755-7c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 28 17:31:28 compute-0 systemd[1]: Started Virtual Machine qemu-2-instance-00000001.
Nov 28 17:31:28 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:28.449 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[6d6dd793-7d53-4436-b5b0-355f22dfb223]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:31:28 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:28.454 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[241df680-8adf-469d-bd66-574fda92f646]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:31:28 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:28.489 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[4badbad3-a4f9-42c3-9b2b-b625bc118c8c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:31:28 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:28.509 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[62a3cb4e-0f8b-4797-9adc-7460e22a10f3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap015f34bb-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3e:3c:9a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 424119, 'reachable_time': 33234, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209163, 'error': None, 'target': 'ovnmeta-015f34bb-5da1-42eb-bab2-066f32a46dd5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:31:28 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:28.532 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[580dbf72-ae8a-431f-9439-4c82efa81c13]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap015f34bb-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 424137, 'tstamp': 424137}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209164, 'error': None, 'target': 'ovnmeta-015f34bb-5da1-42eb-bab2-066f32a46dd5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap015f34bb-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 424140, 'tstamp': 424140}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209164, 'error': None, 'target': 'ovnmeta-015f34bb-5da1-42eb-bab2-066f32a46dd5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:31:28 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:28.534 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap015f34bb-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:31:28 compute-0 nova_compute[187223]: 2025-11-28 17:31:28.536 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:31:28 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:28.538 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap015f34bb-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:31:28 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:28.538 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 17:31:28 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:28.538 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap015f34bb-50, col_values=(('external_ids', {'iface-id': '2de820a4-0104-4404-a104-bd64f5ebe5e5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:31:28 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:28.539 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 17:31:28 compute-0 nova_compute[187223]: 2025-11-28 17:31:28.646 187227 DEBUG nova.compute.manager [req-97478456-5b7a-4c25-b035-63fd071be217 req-a3932e4a-b42f-45a2-8724-aecddcd2ddea 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 8ce9d497-2a8a-4c42-b93f-5778740cbc9b] Received event network-vif-plugged-17cde755-7c4b-416f-b8da-a328887f9819 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:31:28 compute-0 nova_compute[187223]: 2025-11-28 17:31:28.646 187227 DEBUG oslo_concurrency.lockutils [req-97478456-5b7a-4c25-b035-63fd071be217 req-a3932e4a-b42f-45a2-8724-aecddcd2ddea 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "8ce9d497-2a8a-4c42-b93f-5778740cbc9b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:31:28 compute-0 nova_compute[187223]: 2025-11-28 17:31:28.646 187227 DEBUG oslo_concurrency.lockutils [req-97478456-5b7a-4c25-b035-63fd071be217 req-a3932e4a-b42f-45a2-8724-aecddcd2ddea 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "8ce9d497-2a8a-4c42-b93f-5778740cbc9b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:31:28 compute-0 nova_compute[187223]: 2025-11-28 17:31:28.647 187227 DEBUG oslo_concurrency.lockutils [req-97478456-5b7a-4c25-b035-63fd071be217 req-a3932e4a-b42f-45a2-8724-aecddcd2ddea 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "8ce9d497-2a8a-4c42-b93f-5778740cbc9b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:31:28 compute-0 nova_compute[187223]: 2025-11-28 17:31:28.647 187227 DEBUG nova.compute.manager [req-97478456-5b7a-4c25-b035-63fd071be217 req-a3932e4a-b42f-45a2-8724-aecddcd2ddea 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 8ce9d497-2a8a-4c42-b93f-5778740cbc9b] No waiting events found dispatching network-vif-plugged-17cde755-7c4b-416f-b8da-a328887f9819 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:31:28 compute-0 nova_compute[187223]: 2025-11-28 17:31:28.647 187227 WARNING nova.compute.manager [req-97478456-5b7a-4c25-b035-63fd071be217 req-a3932e4a-b42f-45a2-8724-aecddcd2ddea 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 8ce9d497-2a8a-4c42-b93f-5778740cbc9b] Received unexpected event network-vif-plugged-17cde755-7c4b-416f-b8da-a328887f9819 for instance with vm_state active and task_state resize_finish.
Nov 28 17:31:28 compute-0 nova_compute[187223]: 2025-11-28 17:31:28.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:31:28 compute-0 nova_compute[187223]: 2025-11-28 17:31:28.685 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 17:31:28 compute-0 nova_compute[187223]: 2025-11-28 17:31:28.685 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 17:31:28 compute-0 nova_compute[187223]: 2025-11-28 17:31:28.702 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "refresh_cache-8ce9d497-2a8a-4c42-b93f-5778740cbc9b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 17:31:29 compute-0 nova_compute[187223]: 2025-11-28 17:31:29.005 187227 DEBUG nova.virt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Emitting event <LifecycleEvent: 1764351089.0050378, 8ce9d497-2a8a-4c42-b93f-5778740cbc9b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:31:29 compute-0 nova_compute[187223]: 2025-11-28 17:31:29.006 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 8ce9d497-2a8a-4c42-b93f-5778740cbc9b] VM Resumed (Lifecycle Event)
Nov 28 17:31:29 compute-0 nova_compute[187223]: 2025-11-28 17:31:29.009 187227 DEBUG nova.compute.manager [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 8ce9d497-2a8a-4c42-b93f-5778740cbc9b] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 28 17:31:29 compute-0 nova_compute[187223]: 2025-11-28 17:31:29.015 187227 INFO nova.virt.libvirt.driver [-] [instance: 8ce9d497-2a8a-4c42-b93f-5778740cbc9b] Instance running successfully.
Nov 28 17:31:29 compute-0 virtqemud[186845]: argument unsupported: QEMU guest agent is not configured
Nov 28 17:31:29 compute-0 nova_compute[187223]: 2025-11-28 17:31:29.018 187227 DEBUG nova.virt.libvirt.guest [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 8ce9d497-2a8a-4c42-b93f-5778740cbc9b] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Nov 28 17:31:29 compute-0 nova_compute[187223]: 2025-11-28 17:31:29.019 187227 DEBUG nova.virt.libvirt.driver [None req-8ad6455d-16de-43d1-b357-759e79ad394b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 8ce9d497-2a8a-4c42-b93f-5778740cbc9b] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793
Nov 28 17:31:29 compute-0 nova_compute[187223]: 2025-11-28 17:31:29.023 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 8ce9d497-2a8a-4c42-b93f-5778740cbc9b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:31:29 compute-0 nova_compute[187223]: 2025-11-28 17:31:29.027 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 8ce9d497-2a8a-4c42-b93f-5778740cbc9b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 28 17:31:29 compute-0 nova_compute[187223]: 2025-11-28 17:31:29.046 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 8ce9d497-2a8a-4c42-b93f-5778740cbc9b] During sync_power_state the instance has a pending task (resize_finish). Skip.
Nov 28 17:31:29 compute-0 nova_compute[187223]: 2025-11-28 17:31:29.047 187227 DEBUG nova.virt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Emitting event <LifecycleEvent: 1764351089.0087416, 8ce9d497-2a8a-4c42-b93f-5778740cbc9b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:31:29 compute-0 nova_compute[187223]: 2025-11-28 17:31:29.047 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 8ce9d497-2a8a-4c42-b93f-5778740cbc9b] VM Started (Lifecycle Event)
Nov 28 17:31:29 compute-0 nova_compute[187223]: 2025-11-28 17:31:29.078 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 8ce9d497-2a8a-4c42-b93f-5778740cbc9b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:31:29 compute-0 nova_compute[187223]: 2025-11-28 17:31:29.082 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 8ce9d497-2a8a-4c42-b93f-5778740cbc9b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 28 17:31:29 compute-0 nova_compute[187223]: 2025-11-28 17:31:29.411 187227 DEBUG nova.network.neutron [req-0130ec1c-e4a4-4410-afd4-bc7f01934870 req-c6e8c8c1-0850-4700-a3af-24f7320f57f9 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 8ce9d497-2a8a-4c42-b93f-5778740cbc9b] Updated VIF entry in instance network info cache for port 17cde755-7c4b-416f-b8da-a328887f9819. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 28 17:31:29 compute-0 nova_compute[187223]: 2025-11-28 17:31:29.411 187227 DEBUG nova.network.neutron [req-0130ec1c-e4a4-4410-afd4-bc7f01934870 req-c6e8c8c1-0850-4700-a3af-24f7320f57f9 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 8ce9d497-2a8a-4c42-b93f-5778740cbc9b] Updating instance_info_cache with network_info: [{"id": "17cde755-7c4b-416f-b8da-a328887f9819", "address": "fa:16:3e:4e:32:1a", "network": {"id": "015f34bb-5da1-42eb-bab2-066f32a46dd5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-213410289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ea47683b97094cc99b882a5a1b90949f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap17cde755-7c", "ovs_interfaceid": "17cde755-7c4b-416f-b8da-a328887f9819", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:31:29 compute-0 nova_compute[187223]: 2025-11-28 17:31:29.430 187227 DEBUG oslo_concurrency.lockutils [req-0130ec1c-e4a4-4410-afd4-bc7f01934870 req-c6e8c8c1-0850-4700-a3af-24f7320f57f9 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Releasing lock "refresh_cache-8ce9d497-2a8a-4c42-b93f-5778740cbc9b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 17:31:29 compute-0 nova_compute[187223]: 2025-11-28 17:31:29.431 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquired lock "refresh_cache-8ce9d497-2a8a-4c42-b93f-5778740cbc9b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 17:31:29 compute-0 nova_compute[187223]: 2025-11-28 17:31:29.432 187227 DEBUG nova.network.neutron [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] [instance: 8ce9d497-2a8a-4c42-b93f-5778740cbc9b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 28 17:31:29 compute-0 nova_compute[187223]: 2025-11-28 17:31:29.432 187227 DEBUG nova.objects.instance [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 8ce9d497-2a8a-4c42-b93f-5778740cbc9b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:31:29 compute-0 podman[197556]: time="2025-11-28T17:31:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:31:29 compute-0 podman[197556]: @ - - [28/Nov/2025:17:31:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18323 "" "Go-http-client/1.1"
Nov 28 17:31:29 compute-0 podman[197556]: @ - - [28/Nov/2025:17:31:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3050 "" "Go-http-client/1.1"
Nov 28 17:31:30 compute-0 nova_compute[187223]: 2025-11-28 17:31:30.541 187227 DEBUG nova.network.neutron [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] [instance: 8ce9d497-2a8a-4c42-b93f-5778740cbc9b] Updating instance_info_cache with network_info: [{"id": "17cde755-7c4b-416f-b8da-a328887f9819", "address": "fa:16:3e:4e:32:1a", "network": {"id": "015f34bb-5da1-42eb-bab2-066f32a46dd5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-213410289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ea47683b97094cc99b882a5a1b90949f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap17cde755-7c", "ovs_interfaceid": "17cde755-7c4b-416f-b8da-a328887f9819", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:31:30 compute-0 nova_compute[187223]: 2025-11-28 17:31:30.569 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Releasing lock "refresh_cache-8ce9d497-2a8a-4c42-b93f-5778740cbc9b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 17:31:30 compute-0 nova_compute[187223]: 2025-11-28 17:31:30.570 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] [instance: 8ce9d497-2a8a-4c42-b93f-5778740cbc9b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 28 17:31:30 compute-0 nova_compute[187223]: 2025-11-28 17:31:30.743 187227 DEBUG nova.compute.manager [req-ccd4f572-8c09-4dcc-8eba-9a12f37f11c2 req-e9e04ebe-867c-49e6-9184-e3d976ee187f 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 8ce9d497-2a8a-4c42-b93f-5778740cbc9b] Received event network-vif-plugged-17cde755-7c4b-416f-b8da-a328887f9819 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:31:30 compute-0 nova_compute[187223]: 2025-11-28 17:31:30.743 187227 DEBUG oslo_concurrency.lockutils [req-ccd4f572-8c09-4dcc-8eba-9a12f37f11c2 req-e9e04ebe-867c-49e6-9184-e3d976ee187f 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "8ce9d497-2a8a-4c42-b93f-5778740cbc9b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:31:30 compute-0 nova_compute[187223]: 2025-11-28 17:31:30.744 187227 DEBUG oslo_concurrency.lockutils [req-ccd4f572-8c09-4dcc-8eba-9a12f37f11c2 req-e9e04ebe-867c-49e6-9184-e3d976ee187f 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "8ce9d497-2a8a-4c42-b93f-5778740cbc9b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:31:30 compute-0 nova_compute[187223]: 2025-11-28 17:31:30.744 187227 DEBUG oslo_concurrency.lockutils [req-ccd4f572-8c09-4dcc-8eba-9a12f37f11c2 req-e9e04ebe-867c-49e6-9184-e3d976ee187f 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "8ce9d497-2a8a-4c42-b93f-5778740cbc9b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:31:30 compute-0 nova_compute[187223]: 2025-11-28 17:31:30.745 187227 DEBUG nova.compute.manager [req-ccd4f572-8c09-4dcc-8eba-9a12f37f11c2 req-e9e04ebe-867c-49e6-9184-e3d976ee187f 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 8ce9d497-2a8a-4c42-b93f-5778740cbc9b] No waiting events found dispatching network-vif-plugged-17cde755-7c4b-416f-b8da-a328887f9819 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:31:30 compute-0 nova_compute[187223]: 2025-11-28 17:31:30.745 187227 WARNING nova.compute.manager [req-ccd4f572-8c09-4dcc-8eba-9a12f37f11c2 req-e9e04ebe-867c-49e6-9184-e3d976ee187f 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 8ce9d497-2a8a-4c42-b93f-5778740cbc9b] Received unexpected event network-vif-plugged-17cde755-7c4b-416f-b8da-a328887f9819 for instance with vm_state resized and task_state None.
Nov 28 17:31:31 compute-0 openstack_network_exporter[199717]: ERROR   17:31:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:31:31 compute-0 openstack_network_exporter[199717]: ERROR   17:31:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:31:31 compute-0 openstack_network_exporter[199717]: ERROR   17:31:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:31:31 compute-0 openstack_network_exporter[199717]: ERROR   17:31:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:31:31 compute-0 openstack_network_exporter[199717]: ERROR   17:31:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:31:32 compute-0 nova_compute[187223]: 2025-11-28 17:31:32.565 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:31:32 compute-0 nova_compute[187223]: 2025-11-28 17:31:32.931 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:31:33 compute-0 nova_compute[187223]: 2025-11-28 17:31:33.055 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:31:33 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Nov 28 17:31:33 compute-0 systemd[209012]: Activating special unit Exit the Session...
Nov 28 17:31:33 compute-0 systemd[209012]: Stopped target Main User Target.
Nov 28 17:31:33 compute-0 systemd[209012]: Stopped target Basic System.
Nov 28 17:31:33 compute-0 systemd[209012]: Stopped target Paths.
Nov 28 17:31:33 compute-0 systemd[209012]: Stopped target Sockets.
Nov 28 17:31:33 compute-0 systemd[209012]: Stopped target Timers.
Nov 28 17:31:33 compute-0 systemd[209012]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 28 17:31:33 compute-0 systemd[209012]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 28 17:31:33 compute-0 systemd[209012]: Closed D-Bus User Message Bus Socket.
Nov 28 17:31:33 compute-0 systemd[209012]: Stopped Create User's Volatile Files and Directories.
Nov 28 17:31:33 compute-0 systemd[209012]: Removed slice User Application Slice.
Nov 28 17:31:33 compute-0 systemd[209012]: Reached target Shutdown.
Nov 28 17:31:33 compute-0 systemd[209012]: Finished Exit the Session.
Nov 28 17:31:33 compute-0 systemd[209012]: Reached target Exit the Session.
Nov 28 17:31:33 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Nov 28 17:31:33 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Nov 28 17:31:33 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 28 17:31:33 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 28 17:31:33 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 28 17:31:33 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 28 17:31:33 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Nov 28 17:31:37 compute-0 nova_compute[187223]: 2025-11-28 17:31:37.936 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:31:38 compute-0 nova_compute[187223]: 2025-11-28 17:31:38.058 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:31:40 compute-0 podman[209175]: 2025-11-28 17:31:40.224031183 +0000 UTC m=+0.072230951 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 17:31:41 compute-0 ovn_controller[95574]: 2025-11-28T17:31:41Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:4e:32:1a 10.100.0.8
Nov 28 17:31:42 compute-0 nova_compute[187223]: 2025-11-28 17:31:42.938 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:31:43 compute-0 nova_compute[187223]: 2025-11-28 17:31:43.060 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:31:47 compute-0 nova_compute[187223]: 2025-11-28 17:31:47.305 187227 DEBUG oslo_concurrency.lockutils [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Acquiring lock "11258d07-82c4-4bf7-9965-9bd6fa9f6a10" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:31:47 compute-0 nova_compute[187223]: 2025-11-28 17:31:47.305 187227 DEBUG oslo_concurrency.lockutils [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Lock "11258d07-82c4-4bf7-9965-9bd6fa9f6a10" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:31:47 compute-0 nova_compute[187223]: 2025-11-28 17:31:47.327 187227 DEBUG nova.compute.manager [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 28 17:31:47 compute-0 nova_compute[187223]: 2025-11-28 17:31:47.445 187227 DEBUG oslo_concurrency.lockutils [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:31:47 compute-0 nova_compute[187223]: 2025-11-28 17:31:47.446 187227 DEBUG oslo_concurrency.lockutils [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:31:47 compute-0 nova_compute[187223]: 2025-11-28 17:31:47.454 187227 DEBUG nova.virt.hardware [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 28 17:31:47 compute-0 nova_compute[187223]: 2025-11-28 17:31:47.454 187227 INFO nova.compute.claims [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] Claim successful on node compute-0.ctlplane.example.com
Nov 28 17:31:47 compute-0 nova_compute[187223]: 2025-11-28 17:31:47.591 187227 DEBUG nova.compute.provider_tree [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 17:31:47 compute-0 nova_compute[187223]: 2025-11-28 17:31:47.604 187227 DEBUG nova.scheduler.client.report [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 17:31:47 compute-0 nova_compute[187223]: 2025-11-28 17:31:47.632 187227 DEBUG oslo_concurrency.lockutils [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.186s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:31:47 compute-0 nova_compute[187223]: 2025-11-28 17:31:47.634 187227 DEBUG nova.compute.manager [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 28 17:31:47 compute-0 nova_compute[187223]: 2025-11-28 17:31:47.679 187227 DEBUG nova.compute.manager [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 28 17:31:47 compute-0 nova_compute[187223]: 2025-11-28 17:31:47.680 187227 DEBUG nova.network.neutron [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 28 17:31:47 compute-0 nova_compute[187223]: 2025-11-28 17:31:47.697 187227 INFO nova.virt.libvirt.driver [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 28 17:31:47 compute-0 nova_compute[187223]: 2025-11-28 17:31:47.737 187227 DEBUG nova.compute.manager [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 28 17:31:47 compute-0 nova_compute[187223]: 2025-11-28 17:31:47.967 187227 DEBUG nova.policy [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f7ca965410e74fcabced6e50aab5d096', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ea47683b97094cc99b882a5a1b90949f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 28 17:31:47 compute-0 nova_compute[187223]: 2025-11-28 17:31:47.969 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:31:48 compute-0 nova_compute[187223]: 2025-11-28 17:31:48.020 187227 DEBUG nova.compute.manager [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 28 17:31:48 compute-0 nova_compute[187223]: 2025-11-28 17:31:48.023 187227 DEBUG nova.virt.libvirt.driver [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 28 17:31:48 compute-0 nova_compute[187223]: 2025-11-28 17:31:48.025 187227 INFO nova.virt.libvirt.driver [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] Creating image(s)
Nov 28 17:31:48 compute-0 nova_compute[187223]: 2025-11-28 17:31:48.026 187227 DEBUG oslo_concurrency.lockutils [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Acquiring lock "/var/lib/nova/instances/11258d07-82c4-4bf7-9965-9bd6fa9f6a10/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:31:48 compute-0 nova_compute[187223]: 2025-11-28 17:31:48.027 187227 DEBUG oslo_concurrency.lockutils [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Lock "/var/lib/nova/instances/11258d07-82c4-4bf7-9965-9bd6fa9f6a10/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:31:48 compute-0 nova_compute[187223]: 2025-11-28 17:31:48.028 187227 DEBUG oslo_concurrency.lockutils [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Lock "/var/lib/nova/instances/11258d07-82c4-4bf7-9965-9bd6fa9f6a10/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:31:48 compute-0 nova_compute[187223]: 2025-11-28 17:31:48.044 187227 DEBUG oslo_concurrency.processutils [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:31:48 compute-0 nova_compute[187223]: 2025-11-28 17:31:48.075 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:31:48 compute-0 nova_compute[187223]: 2025-11-28 17:31:48.127 187227 DEBUG oslo_concurrency.processutils [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:31:48 compute-0 nova_compute[187223]: 2025-11-28 17:31:48.128 187227 DEBUG oslo_concurrency.lockutils [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Acquiring lock "bd6565e0cc32420aac21dd3de8842bc53bb13eba" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:31:48 compute-0 nova_compute[187223]: 2025-11-28 17:31:48.129 187227 DEBUG oslo_concurrency.lockutils [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Lock "bd6565e0cc32420aac21dd3de8842bc53bb13eba" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:31:48 compute-0 nova_compute[187223]: 2025-11-28 17:31:48.140 187227 DEBUG oslo_concurrency.processutils [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:31:48 compute-0 nova_compute[187223]: 2025-11-28 17:31:48.199 187227 DEBUG oslo_concurrency.processutils [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:31:48 compute-0 nova_compute[187223]: 2025-11-28 17:31:48.201 187227 DEBUG oslo_concurrency.processutils [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba,backing_fmt=raw /var/lib/nova/instances/11258d07-82c4-4bf7-9965-9bd6fa9f6a10/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:31:48 compute-0 nova_compute[187223]: 2025-11-28 17:31:48.240 187227 DEBUG oslo_concurrency.processutils [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba,backing_fmt=raw /var/lib/nova/instances/11258d07-82c4-4bf7-9965-9bd6fa9f6a10/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:31:48 compute-0 nova_compute[187223]: 2025-11-28 17:31:48.242 187227 DEBUG oslo_concurrency.lockutils [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Lock "bd6565e0cc32420aac21dd3de8842bc53bb13eba" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:31:48 compute-0 nova_compute[187223]: 2025-11-28 17:31:48.243 187227 DEBUG oslo_concurrency.processutils [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:31:48 compute-0 nova_compute[187223]: 2025-11-28 17:31:48.303 187227 DEBUG oslo_concurrency.processutils [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:31:48 compute-0 nova_compute[187223]: 2025-11-28 17:31:48.305 187227 DEBUG nova.virt.disk.api [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Checking if we can resize image /var/lib/nova/instances/11258d07-82c4-4bf7-9965-9bd6fa9f6a10/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 28 17:31:48 compute-0 nova_compute[187223]: 2025-11-28 17:31:48.305 187227 DEBUG oslo_concurrency.processutils [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/11258d07-82c4-4bf7-9965-9bd6fa9f6a10/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:31:48 compute-0 nova_compute[187223]: 2025-11-28 17:31:48.363 187227 DEBUG oslo_concurrency.processutils [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/11258d07-82c4-4bf7-9965-9bd6fa9f6a10/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:31:48 compute-0 nova_compute[187223]: 2025-11-28 17:31:48.364 187227 DEBUG nova.virt.disk.api [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Cannot resize image /var/lib/nova/instances/11258d07-82c4-4bf7-9965-9bd6fa9f6a10/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 28 17:31:48 compute-0 nova_compute[187223]: 2025-11-28 17:31:48.364 187227 DEBUG nova.objects.instance [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Lazy-loading 'migration_context' on Instance uuid 11258d07-82c4-4bf7-9965-9bd6fa9f6a10 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:31:48 compute-0 nova_compute[187223]: 2025-11-28 17:31:48.379 187227 DEBUG nova.virt.libvirt.driver [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 28 17:31:48 compute-0 nova_compute[187223]: 2025-11-28 17:31:48.379 187227 DEBUG nova.virt.libvirt.driver [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] Ensure instance console log exists: /var/lib/nova/instances/11258d07-82c4-4bf7-9965-9bd6fa9f6a10/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 28 17:31:48 compute-0 nova_compute[187223]: 2025-11-28 17:31:48.380 187227 DEBUG oslo_concurrency.lockutils [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:31:48 compute-0 nova_compute[187223]: 2025-11-28 17:31:48.380 187227 DEBUG oslo_concurrency.lockutils [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:31:48 compute-0 nova_compute[187223]: 2025-11-28 17:31:48.381 187227 DEBUG oslo_concurrency.lockutils [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:31:49 compute-0 podman[209224]: 2025-11-28 17:31:49.248040081 +0000 UTC m=+0.083843932 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 17:31:50 compute-0 nova_compute[187223]: 2025-11-28 17:31:50.016 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:31:50 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:50.015 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3e:3c:af', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '0e:e0:60:f9:5f:25'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 17:31:50 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:50.020 104433 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 17:31:50 compute-0 nova_compute[187223]: 2025-11-28 17:31:50.123 187227 DEBUG nova.network.neutron [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] Successfully created port: fd926b88-53a3-4c34-aaa1-2957370ba65a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 28 17:31:51 compute-0 nova_compute[187223]: 2025-11-28 17:31:51.588 187227 DEBUG nova.network.neutron [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] Successfully updated port: fd926b88-53a3-4c34-aaa1-2957370ba65a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 28 17:31:51 compute-0 nova_compute[187223]: 2025-11-28 17:31:51.606 187227 DEBUG oslo_concurrency.lockutils [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Acquiring lock "refresh_cache-11258d07-82c4-4bf7-9965-9bd6fa9f6a10" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 17:31:51 compute-0 nova_compute[187223]: 2025-11-28 17:31:51.607 187227 DEBUG oslo_concurrency.lockutils [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Acquired lock "refresh_cache-11258d07-82c4-4bf7-9965-9bd6fa9f6a10" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 17:31:51 compute-0 nova_compute[187223]: 2025-11-28 17:31:51.607 187227 DEBUG nova.network.neutron [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 28 17:31:51 compute-0 nova_compute[187223]: 2025-11-28 17:31:51.694 187227 DEBUG nova.compute.manager [req-6383c906-277d-44a0-a957-1a7a8edb70cd req-414e4e5f-3c7c-47e9-a439-d1ae9ed26cf1 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] Received event network-changed-fd926b88-53a3-4c34-aaa1-2957370ba65a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:31:51 compute-0 nova_compute[187223]: 2025-11-28 17:31:51.695 187227 DEBUG nova.compute.manager [req-6383c906-277d-44a0-a957-1a7a8edb70cd req-414e4e5f-3c7c-47e9-a439-d1ae9ed26cf1 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] Refreshing instance network info cache due to event network-changed-fd926b88-53a3-4c34-aaa1-2957370ba65a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 28 17:31:51 compute-0 nova_compute[187223]: 2025-11-28 17:31:51.695 187227 DEBUG oslo_concurrency.lockutils [req-6383c906-277d-44a0-a957-1a7a8edb70cd req-414e4e5f-3c7c-47e9-a439-d1ae9ed26cf1 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "refresh_cache-11258d07-82c4-4bf7-9965-9bd6fa9f6a10" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 17:31:51 compute-0 nova_compute[187223]: 2025-11-28 17:31:51.765 187227 DEBUG nova.network.neutron [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 28 17:31:53 compute-0 nova_compute[187223]: 2025-11-28 17:31:52.999 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:31:53 compute-0 nova_compute[187223]: 2025-11-28 17:31:53.077 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:31:53 compute-0 podman[209243]: 2025-11-28 17:31:53.242704251 +0000 UTC m=+0.101322124 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 17:31:53 compute-0 podman[209244]: 2025-11-28 17:31:53.270361873 +0000 UTC m=+0.124823674 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller)
Nov 28 17:31:53 compute-0 nova_compute[187223]: 2025-11-28 17:31:53.270 187227 DEBUG nova.network.neutron [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] Updating instance_info_cache with network_info: [{"id": "fd926b88-53a3-4c34-aaa1-2957370ba65a", "address": "fa:16:3e:4a:40:ab", "network": {"id": "015f34bb-5da1-42eb-bab2-066f32a46dd5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-213410289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ea47683b97094cc99b882a5a1b90949f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd926b88-53", "ovs_interfaceid": "fd926b88-53a3-4c34-aaa1-2957370ba65a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:31:53 compute-0 nova_compute[187223]: 2025-11-28 17:31:53.288 187227 DEBUG oslo_concurrency.lockutils [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Releasing lock "refresh_cache-11258d07-82c4-4bf7-9965-9bd6fa9f6a10" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 17:31:53 compute-0 nova_compute[187223]: 2025-11-28 17:31:53.289 187227 DEBUG nova.compute.manager [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] Instance network_info: |[{"id": "fd926b88-53a3-4c34-aaa1-2957370ba65a", "address": "fa:16:3e:4a:40:ab", "network": {"id": "015f34bb-5da1-42eb-bab2-066f32a46dd5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-213410289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ea47683b97094cc99b882a5a1b90949f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd926b88-53", "ovs_interfaceid": "fd926b88-53a3-4c34-aaa1-2957370ba65a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 28 17:31:53 compute-0 nova_compute[187223]: 2025-11-28 17:31:53.289 187227 DEBUG oslo_concurrency.lockutils [req-6383c906-277d-44a0-a957-1a7a8edb70cd req-414e4e5f-3c7c-47e9-a439-d1ae9ed26cf1 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquired lock "refresh_cache-11258d07-82c4-4bf7-9965-9bd6fa9f6a10" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 17:31:53 compute-0 nova_compute[187223]: 2025-11-28 17:31:53.289 187227 DEBUG nova.network.neutron [req-6383c906-277d-44a0-a957-1a7a8edb70cd req-414e4e5f-3c7c-47e9-a439-d1ae9ed26cf1 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] Refreshing network info cache for port fd926b88-53a3-4c34-aaa1-2957370ba65a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 28 17:31:53 compute-0 nova_compute[187223]: 2025-11-28 17:31:53.292 187227 DEBUG nova.virt.libvirt.driver [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] Start _get_guest_xml network_info=[{"id": "fd926b88-53a3-4c34-aaa1-2957370ba65a", "address": "fa:16:3e:4a:40:ab", "network": {"id": "015f34bb-5da1-42eb-bab2-066f32a46dd5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-213410289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ea47683b97094cc99b882a5a1b90949f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd926b88-53", "ovs_interfaceid": "fd926b88-53a3-4c34-aaa1-2957370ba65a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T17:27:07Z,direct_url=<?>,disk_format='qcow2',id=e66bcfff-a835-4b6a-9892-490d158c356a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ca3b936304c947f98c618f4bf25dee0e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-28T17:27:08Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_format': None, 'device_name': '/dev/vda', 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e66bcfff-a835-4b6a-9892-490d158c356a'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 28 17:31:53 compute-0 nova_compute[187223]: 2025-11-28 17:31:53.298 187227 WARNING nova.virt.libvirt.driver [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 17:31:53 compute-0 nova_compute[187223]: 2025-11-28 17:31:53.305 187227 DEBUG nova.virt.libvirt.host [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 28 17:31:53 compute-0 nova_compute[187223]: 2025-11-28 17:31:53.306 187227 DEBUG nova.virt.libvirt.host [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 28 17:31:53 compute-0 nova_compute[187223]: 2025-11-28 17:31:53.314 187227 DEBUG nova.virt.libvirt.host [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 28 17:31:53 compute-0 nova_compute[187223]: 2025-11-28 17:31:53.315 187227 DEBUG nova.virt.libvirt.host [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 28 17:31:53 compute-0 nova_compute[187223]: 2025-11-28 17:31:53.316 187227 DEBUG nova.virt.libvirt.driver [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 28 17:31:53 compute-0 nova_compute[187223]: 2025-11-28 17:31:53.316 187227 DEBUG nova.virt.hardware [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-28T17:27:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6f44bded-bdbe-4623-9c87-afc5919e8381',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T17:27:07Z,direct_url=<?>,disk_format='qcow2',id=e66bcfff-a835-4b6a-9892-490d158c356a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ca3b936304c947f98c618f4bf25dee0e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-28T17:27:08Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 28 17:31:53 compute-0 nova_compute[187223]: 2025-11-28 17:31:53.317 187227 DEBUG nova.virt.hardware [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 28 17:31:53 compute-0 nova_compute[187223]: 2025-11-28 17:31:53.317 187227 DEBUG nova.virt.hardware [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 28 17:31:53 compute-0 nova_compute[187223]: 2025-11-28 17:31:53.317 187227 DEBUG nova.virt.hardware [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 28 17:31:53 compute-0 nova_compute[187223]: 2025-11-28 17:31:53.317 187227 DEBUG nova.virt.hardware [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 28 17:31:53 compute-0 nova_compute[187223]: 2025-11-28 17:31:53.317 187227 DEBUG nova.virt.hardware [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 28 17:31:53 compute-0 nova_compute[187223]: 2025-11-28 17:31:53.317 187227 DEBUG nova.virt.hardware [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 28 17:31:53 compute-0 nova_compute[187223]: 2025-11-28 17:31:53.318 187227 DEBUG nova.virt.hardware [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 28 17:31:53 compute-0 nova_compute[187223]: 2025-11-28 17:31:53.318 187227 DEBUG nova.virt.hardware [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 28 17:31:53 compute-0 nova_compute[187223]: 2025-11-28 17:31:53.318 187227 DEBUG nova.virt.hardware [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 28 17:31:53 compute-0 nova_compute[187223]: 2025-11-28 17:31:53.318 187227 DEBUG nova.virt.hardware [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 28 17:31:53 compute-0 nova_compute[187223]: 2025-11-28 17:31:53.321 187227 DEBUG nova.virt.libvirt.vif [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T17:31:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-413330110',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-413330110',id=4,image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ea47683b97094cc99b882a5a1b90949f',ramdisk_id='',reservation_id='r-6watjtec',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-544878882',owner_user_name='tempest-TestExecuteActionsViaActuator-544878882-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T17:31:47Z,user_data=None,user_id='f7ca965410e74fcabced6e50aab5d096',uuid=11258d07-82c4-4bf7-9965-9bd6fa9f6a10,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fd926b88-53a3-4c34-aaa1-2957370ba65a", "address": "fa:16:3e:4a:40:ab", "network": {"id": "015f34bb-5da1-42eb-bab2-066f32a46dd5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-213410289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ea47683b97094cc99b882a5a1b90949f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd926b88-53", "ovs_interfaceid": "fd926b88-53a3-4c34-aaa1-2957370ba65a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 28 17:31:53 compute-0 nova_compute[187223]: 2025-11-28 17:31:53.322 187227 DEBUG nova.network.os_vif_util [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Converting VIF {"id": "fd926b88-53a3-4c34-aaa1-2957370ba65a", "address": "fa:16:3e:4a:40:ab", "network": {"id": "015f34bb-5da1-42eb-bab2-066f32a46dd5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-213410289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ea47683b97094cc99b882a5a1b90949f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd926b88-53", "ovs_interfaceid": "fd926b88-53a3-4c34-aaa1-2957370ba65a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 17:31:53 compute-0 nova_compute[187223]: 2025-11-28 17:31:53.322 187227 DEBUG nova.network.os_vif_util [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4a:40:ab,bridge_name='br-int',has_traffic_filtering=True,id=fd926b88-53a3-4c34-aaa1-2957370ba65a,network=Network(015f34bb-5da1-42eb-bab2-066f32a46dd5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfd926b88-53') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 17:31:53 compute-0 nova_compute[187223]: 2025-11-28 17:31:53.323 187227 DEBUG nova.objects.instance [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Lazy-loading 'pci_devices' on Instance uuid 11258d07-82c4-4bf7-9965-9bd6fa9f6a10 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:31:53 compute-0 nova_compute[187223]: 2025-11-28 17:31:53.339 187227 DEBUG nova.virt.libvirt.driver [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] End _get_guest_xml xml=<domain type="kvm">
Nov 28 17:31:53 compute-0 nova_compute[187223]:   <uuid>11258d07-82c4-4bf7-9965-9bd6fa9f6a10</uuid>
Nov 28 17:31:53 compute-0 nova_compute[187223]:   <name>instance-00000004</name>
Nov 28 17:31:53 compute-0 nova_compute[187223]:   <memory>131072</memory>
Nov 28 17:31:53 compute-0 nova_compute[187223]:   <vcpu>1</vcpu>
Nov 28 17:31:53 compute-0 nova_compute[187223]:   <metadata>
Nov 28 17:31:53 compute-0 nova_compute[187223]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 28 17:31:53 compute-0 nova_compute[187223]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 28 17:31:53 compute-0 nova_compute[187223]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-413330110</nova:name>
Nov 28 17:31:53 compute-0 nova_compute[187223]:       <nova:creationTime>2025-11-28 17:31:53</nova:creationTime>
Nov 28 17:31:53 compute-0 nova_compute[187223]:       <nova:flavor name="m1.nano">
Nov 28 17:31:53 compute-0 nova_compute[187223]:         <nova:memory>128</nova:memory>
Nov 28 17:31:53 compute-0 nova_compute[187223]:         <nova:disk>1</nova:disk>
Nov 28 17:31:53 compute-0 nova_compute[187223]:         <nova:swap>0</nova:swap>
Nov 28 17:31:53 compute-0 nova_compute[187223]:         <nova:ephemeral>0</nova:ephemeral>
Nov 28 17:31:53 compute-0 nova_compute[187223]:         <nova:vcpus>1</nova:vcpus>
Nov 28 17:31:53 compute-0 nova_compute[187223]:       </nova:flavor>
Nov 28 17:31:53 compute-0 nova_compute[187223]:       <nova:owner>
Nov 28 17:31:53 compute-0 nova_compute[187223]:         <nova:user uuid="f7ca965410e74fcabced6e50aab5d096">tempest-TestExecuteActionsViaActuator-544878882-project-member</nova:user>
Nov 28 17:31:53 compute-0 nova_compute[187223]:         <nova:project uuid="ea47683b97094cc99b882a5a1b90949f">tempest-TestExecuteActionsViaActuator-544878882</nova:project>
Nov 28 17:31:53 compute-0 nova_compute[187223]:       </nova:owner>
Nov 28 17:31:53 compute-0 nova_compute[187223]:       <nova:root type="image" uuid="e66bcfff-a835-4b6a-9892-490d158c356a"/>
Nov 28 17:31:53 compute-0 nova_compute[187223]:       <nova:ports>
Nov 28 17:31:53 compute-0 nova_compute[187223]:         <nova:port uuid="fd926b88-53a3-4c34-aaa1-2957370ba65a">
Nov 28 17:31:53 compute-0 nova_compute[187223]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Nov 28 17:31:53 compute-0 nova_compute[187223]:         </nova:port>
Nov 28 17:31:53 compute-0 nova_compute[187223]:       </nova:ports>
Nov 28 17:31:53 compute-0 nova_compute[187223]:     </nova:instance>
Nov 28 17:31:53 compute-0 nova_compute[187223]:   </metadata>
Nov 28 17:31:53 compute-0 nova_compute[187223]:   <sysinfo type="smbios">
Nov 28 17:31:53 compute-0 nova_compute[187223]:     <system>
Nov 28 17:31:53 compute-0 nova_compute[187223]:       <entry name="manufacturer">RDO</entry>
Nov 28 17:31:53 compute-0 nova_compute[187223]:       <entry name="product">OpenStack Compute</entry>
Nov 28 17:31:53 compute-0 nova_compute[187223]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 28 17:31:53 compute-0 nova_compute[187223]:       <entry name="serial">11258d07-82c4-4bf7-9965-9bd6fa9f6a10</entry>
Nov 28 17:31:53 compute-0 nova_compute[187223]:       <entry name="uuid">11258d07-82c4-4bf7-9965-9bd6fa9f6a10</entry>
Nov 28 17:31:53 compute-0 nova_compute[187223]:       <entry name="family">Virtual Machine</entry>
Nov 28 17:31:53 compute-0 nova_compute[187223]:     </system>
Nov 28 17:31:53 compute-0 nova_compute[187223]:   </sysinfo>
Nov 28 17:31:53 compute-0 nova_compute[187223]:   <os>
Nov 28 17:31:53 compute-0 nova_compute[187223]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 28 17:31:53 compute-0 nova_compute[187223]:     <boot dev="hd"/>
Nov 28 17:31:53 compute-0 nova_compute[187223]:     <smbios mode="sysinfo"/>
Nov 28 17:31:53 compute-0 nova_compute[187223]:   </os>
Nov 28 17:31:53 compute-0 nova_compute[187223]:   <features>
Nov 28 17:31:53 compute-0 nova_compute[187223]:     <acpi/>
Nov 28 17:31:53 compute-0 nova_compute[187223]:     <apic/>
Nov 28 17:31:53 compute-0 nova_compute[187223]:     <vmcoreinfo/>
Nov 28 17:31:53 compute-0 nova_compute[187223]:   </features>
Nov 28 17:31:53 compute-0 nova_compute[187223]:   <clock offset="utc">
Nov 28 17:31:53 compute-0 nova_compute[187223]:     <timer name="pit" tickpolicy="delay"/>
Nov 28 17:31:53 compute-0 nova_compute[187223]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 28 17:31:53 compute-0 nova_compute[187223]:     <timer name="hpet" present="no"/>
Nov 28 17:31:53 compute-0 nova_compute[187223]:   </clock>
Nov 28 17:31:53 compute-0 nova_compute[187223]:   <cpu mode="custom" match="exact">
Nov 28 17:31:53 compute-0 nova_compute[187223]:     <model>Nehalem</model>
Nov 28 17:31:53 compute-0 nova_compute[187223]:     <topology sockets="1" cores="1" threads="1"/>
Nov 28 17:31:53 compute-0 nova_compute[187223]:   </cpu>
Nov 28 17:31:53 compute-0 nova_compute[187223]:   <devices>
Nov 28 17:31:53 compute-0 nova_compute[187223]:     <disk type="file" device="disk">
Nov 28 17:31:53 compute-0 nova_compute[187223]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 28 17:31:53 compute-0 nova_compute[187223]:       <source file="/var/lib/nova/instances/11258d07-82c4-4bf7-9965-9bd6fa9f6a10/disk"/>
Nov 28 17:31:53 compute-0 nova_compute[187223]:       <target dev="vda" bus="virtio"/>
Nov 28 17:31:53 compute-0 nova_compute[187223]:     </disk>
Nov 28 17:31:53 compute-0 nova_compute[187223]:     <disk type="file" device="cdrom">
Nov 28 17:31:53 compute-0 nova_compute[187223]:       <driver name="qemu" type="raw" cache="none"/>
Nov 28 17:31:53 compute-0 nova_compute[187223]:       <source file="/var/lib/nova/instances/11258d07-82c4-4bf7-9965-9bd6fa9f6a10/disk.config"/>
Nov 28 17:31:53 compute-0 nova_compute[187223]:       <target dev="sda" bus="sata"/>
Nov 28 17:31:53 compute-0 nova_compute[187223]:     </disk>
Nov 28 17:31:53 compute-0 nova_compute[187223]:     <interface type="ethernet">
Nov 28 17:31:53 compute-0 nova_compute[187223]:       <mac address="fa:16:3e:4a:40:ab"/>
Nov 28 17:31:53 compute-0 nova_compute[187223]:       <model type="virtio"/>
Nov 28 17:31:53 compute-0 nova_compute[187223]:       <driver name="vhost" rx_queue_size="512"/>
Nov 28 17:31:53 compute-0 nova_compute[187223]:       <mtu size="1442"/>
Nov 28 17:31:53 compute-0 nova_compute[187223]:       <target dev="tapfd926b88-53"/>
Nov 28 17:31:53 compute-0 nova_compute[187223]:     </interface>
Nov 28 17:31:53 compute-0 nova_compute[187223]:     <serial type="pty">
Nov 28 17:31:53 compute-0 nova_compute[187223]:       <log file="/var/lib/nova/instances/11258d07-82c4-4bf7-9965-9bd6fa9f6a10/console.log" append="off"/>
Nov 28 17:31:53 compute-0 nova_compute[187223]:     </serial>
Nov 28 17:31:53 compute-0 nova_compute[187223]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 28 17:31:53 compute-0 nova_compute[187223]:     <video>
Nov 28 17:31:53 compute-0 nova_compute[187223]:       <model type="virtio"/>
Nov 28 17:31:53 compute-0 nova_compute[187223]:     </video>
Nov 28 17:31:53 compute-0 nova_compute[187223]:     <input type="tablet" bus="usb"/>
Nov 28 17:31:53 compute-0 nova_compute[187223]:     <rng model="virtio">
Nov 28 17:31:53 compute-0 nova_compute[187223]:       <backend model="random">/dev/urandom</backend>
Nov 28 17:31:53 compute-0 nova_compute[187223]:     </rng>
Nov 28 17:31:53 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root"/>
Nov 28 17:31:53 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:53 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:53 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:53 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:53 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:53 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:53 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:53 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:53 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:53 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:53 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:53 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:53 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:53 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:53 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:53 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:53 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:53 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:53 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:53 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:53 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:53 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:53 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:53 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:31:53 compute-0 nova_compute[187223]:     <controller type="usb" index="0"/>
Nov 28 17:31:53 compute-0 nova_compute[187223]:     <memballoon model="virtio">
Nov 28 17:31:53 compute-0 nova_compute[187223]:       <stats period="10"/>
Nov 28 17:31:53 compute-0 nova_compute[187223]:     </memballoon>
Nov 28 17:31:53 compute-0 nova_compute[187223]:   </devices>
Nov 28 17:31:53 compute-0 nova_compute[187223]: </domain>
Nov 28 17:31:53 compute-0 nova_compute[187223]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 28 17:31:53 compute-0 nova_compute[187223]: 2025-11-28 17:31:53.339 187227 DEBUG nova.compute.manager [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] Preparing to wait for external event network-vif-plugged-fd926b88-53a3-4c34-aaa1-2957370ba65a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 28 17:31:53 compute-0 nova_compute[187223]: 2025-11-28 17:31:53.339 187227 DEBUG oslo_concurrency.lockutils [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Acquiring lock "11258d07-82c4-4bf7-9965-9bd6fa9f6a10-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:31:53 compute-0 nova_compute[187223]: 2025-11-28 17:31:53.339 187227 DEBUG oslo_concurrency.lockutils [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Lock "11258d07-82c4-4bf7-9965-9bd6fa9f6a10-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:31:53 compute-0 nova_compute[187223]: 2025-11-28 17:31:53.340 187227 DEBUG oslo_concurrency.lockutils [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Lock "11258d07-82c4-4bf7-9965-9bd6fa9f6a10-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:31:53 compute-0 nova_compute[187223]: 2025-11-28 17:31:53.340 187227 DEBUG nova.virt.libvirt.vif [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T17:31:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-413330110',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-413330110',id=4,image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ea47683b97094cc99b882a5a1b90949f',ramdisk_id='',reservation_id='r-6watjtec',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-544878882',owner_user_name='tempest-TestExecuteActionsViaActuator-544878882-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T17:31:47Z,user_data=None,user_id='f7ca965410e74fcabced6e50aab5d096',uuid=11258d07-82c4-4bf7-9965-9bd6fa9f6a10,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fd926b88-53a3-4c34-aaa1-2957370ba65a", "address": "fa:16:3e:4a:40:ab", "network": {"id": "015f34bb-5da1-42eb-bab2-066f32a46dd5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-213410289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ea47683b97094cc99b882a5a1b90949f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd926b88-53", "ovs_interfaceid": "fd926b88-53a3-4c34-aaa1-2957370ba65a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 28 17:31:53 compute-0 nova_compute[187223]: 2025-11-28 17:31:53.341 187227 DEBUG nova.network.os_vif_util [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Converting VIF {"id": "fd926b88-53a3-4c34-aaa1-2957370ba65a", "address": "fa:16:3e:4a:40:ab", "network": {"id": "015f34bb-5da1-42eb-bab2-066f32a46dd5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-213410289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ea47683b97094cc99b882a5a1b90949f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd926b88-53", "ovs_interfaceid": "fd926b88-53a3-4c34-aaa1-2957370ba65a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 17:31:53 compute-0 nova_compute[187223]: 2025-11-28 17:31:53.341 187227 DEBUG nova.network.os_vif_util [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4a:40:ab,bridge_name='br-int',has_traffic_filtering=True,id=fd926b88-53a3-4c34-aaa1-2957370ba65a,network=Network(015f34bb-5da1-42eb-bab2-066f32a46dd5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfd926b88-53') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 17:31:53 compute-0 nova_compute[187223]: 2025-11-28 17:31:53.341 187227 DEBUG os_vif [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4a:40:ab,bridge_name='br-int',has_traffic_filtering=True,id=fd926b88-53a3-4c34-aaa1-2957370ba65a,network=Network(015f34bb-5da1-42eb-bab2-066f32a46dd5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfd926b88-53') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 28 17:31:53 compute-0 nova_compute[187223]: 2025-11-28 17:31:53.342 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:31:53 compute-0 nova_compute[187223]: 2025-11-28 17:31:53.342 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:31:53 compute-0 nova_compute[187223]: 2025-11-28 17:31:53.343 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 17:31:53 compute-0 nova_compute[187223]: 2025-11-28 17:31:53.347 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:31:53 compute-0 nova_compute[187223]: 2025-11-28 17:31:53.347 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfd926b88-53, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:31:53 compute-0 nova_compute[187223]: 2025-11-28 17:31:53.348 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapfd926b88-53, col_values=(('external_ids', {'iface-id': 'fd926b88-53a3-4c34-aaa1-2957370ba65a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4a:40:ab', 'vm-uuid': '11258d07-82c4-4bf7-9965-9bd6fa9f6a10'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:31:53 compute-0 nova_compute[187223]: 2025-11-28 17:31:53.349 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:31:53 compute-0 NetworkManager[55763]: <info>  [1764351113.3505] manager: (tapfd926b88-53): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/26)
Nov 28 17:31:53 compute-0 nova_compute[187223]: 2025-11-28 17:31:53.352 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 17:31:53 compute-0 nova_compute[187223]: 2025-11-28 17:31:53.359 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:31:53 compute-0 nova_compute[187223]: 2025-11-28 17:31:53.362 187227 INFO os_vif [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4a:40:ab,bridge_name='br-int',has_traffic_filtering=True,id=fd926b88-53a3-4c34-aaa1-2957370ba65a,network=Network(015f34bb-5da1-42eb-bab2-066f32a46dd5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfd926b88-53')
Nov 28 17:31:53 compute-0 nova_compute[187223]: 2025-11-28 17:31:53.426 187227 DEBUG nova.virt.libvirt.driver [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 28 17:31:53 compute-0 nova_compute[187223]: 2025-11-28 17:31:53.427 187227 DEBUG nova.virt.libvirt.driver [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 28 17:31:53 compute-0 nova_compute[187223]: 2025-11-28 17:31:53.427 187227 DEBUG nova.virt.libvirt.driver [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] No VIF found with MAC fa:16:3e:4a:40:ab, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 28 17:31:53 compute-0 nova_compute[187223]: 2025-11-28 17:31:53.427 187227 INFO nova.virt.libvirt.driver [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] Using config drive
Nov 28 17:31:54 compute-0 nova_compute[187223]: 2025-11-28 17:31:54.289 187227 INFO nova.virt.libvirt.driver [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] Creating config drive at /var/lib/nova/instances/11258d07-82c4-4bf7-9965-9bd6fa9f6a10/disk.config
Nov 28 17:31:54 compute-0 nova_compute[187223]: 2025-11-28 17:31:54.295 187227 DEBUG oslo_concurrency.processutils [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/11258d07-82c4-4bf7-9965-9bd6fa9f6a10/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpi2wqm7vy execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:31:54 compute-0 nova_compute[187223]: 2025-11-28 17:31:54.424 187227 DEBUG oslo_concurrency.processutils [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/11258d07-82c4-4bf7-9965-9bd6fa9f6a10/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpi2wqm7vy" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:31:54 compute-0 kernel: tapfd926b88-53: entered promiscuous mode
Nov 28 17:31:54 compute-0 NetworkManager[55763]: <info>  [1764351114.5081] manager: (tapfd926b88-53): new Tun device (/org/freedesktop/NetworkManager/Devices/27)
Nov 28 17:31:54 compute-0 ovn_controller[95574]: 2025-11-28T17:31:54Z|00036|binding|INFO|Claiming lport fd926b88-53a3-4c34-aaa1-2957370ba65a for this chassis.
Nov 28 17:31:54 compute-0 ovn_controller[95574]: 2025-11-28T17:31:54Z|00037|binding|INFO|fd926b88-53a3-4c34-aaa1-2957370ba65a: Claiming fa:16:3e:4a:40:ab 10.100.0.14
Nov 28 17:31:54 compute-0 nova_compute[187223]: 2025-11-28 17:31:54.510 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:31:54 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:54.517 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4a:40:ab 10.100.0.14'], port_security=['fa:16:3e:4a:40:ab 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '11258d07-82c4-4bf7-9965-9bd6fa9f6a10', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-015f34bb-5da1-42eb-bab2-066f32a46dd5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ea47683b97094cc99b882a5a1b90949f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '82cdfce6-8f2d-44f3-bd0a-80dabea6bfd5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f4d8a34e-2c92-41ae-a2d1-bdb3f1fafb55, chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], logical_port=fd926b88-53a3-4c34-aaa1-2957370ba65a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 17:31:54 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:54.518 104433 INFO neutron.agent.ovn.metadata.agent [-] Port fd926b88-53a3-4c34-aaa1-2957370ba65a in datapath 015f34bb-5da1-42eb-bab2-066f32a46dd5 bound to our chassis
Nov 28 17:31:54 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:54.520 104433 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 015f34bb-5da1-42eb-bab2-066f32a46dd5
Nov 28 17:31:54 compute-0 ovn_controller[95574]: 2025-11-28T17:31:54Z|00038|binding|INFO|Setting lport fd926b88-53a3-4c34-aaa1-2957370ba65a ovn-installed in OVS
Nov 28 17:31:54 compute-0 ovn_controller[95574]: 2025-11-28T17:31:54Z|00039|binding|INFO|Setting lport fd926b88-53a3-4c34-aaa1-2957370ba65a up in Southbound
Nov 28 17:31:54 compute-0 nova_compute[187223]: 2025-11-28 17:31:54.526 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:31:54 compute-0 nova_compute[187223]: 2025-11-28 17:31:54.531 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:31:54 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:54.547 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[9942acef-fa4f-41b1-8be6-9c471d05e99a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:31:54 compute-0 systemd-machined[153517]: New machine qemu-3-instance-00000004.
Nov 28 17:31:54 compute-0 systemd-udevd[209311]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 17:31:54 compute-0 systemd[1]: Started Virtual Machine qemu-3-instance-00000004.
Nov 28 17:31:54 compute-0 NetworkManager[55763]: <info>  [1764351114.5786] device (tapfd926b88-53): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 28 17:31:54 compute-0 NetworkManager[55763]: <info>  [1764351114.5801] device (tapfd926b88-53): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 28 17:31:54 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:54.581 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[ea6ffec3-8879-40dc-9780-f4b5583d54f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:31:54 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:54.585 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[1fe6498e-2fc5-48e7-ae79-8ccfb6f8f981]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:31:54 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:54.618 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[efc6006b-e07c-4362-9d03-e6cb53d3f059]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:31:54 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:54.645 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[40b9b8e2-f64d-4a12-a94b-857f00661378]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap015f34bb-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3e:3c:9a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 424119, 'reachable_time': 33234, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209322, 'error': None, 'target': 'ovnmeta-015f34bb-5da1-42eb-bab2-066f32a46dd5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:31:54 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:54.670 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[60c671a8-4994-430c-b1b8-413d0feee0a9]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap015f34bb-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 424137, 'tstamp': 424137}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209324, 'error': None, 'target': 'ovnmeta-015f34bb-5da1-42eb-bab2-066f32a46dd5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap015f34bb-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 424140, 'tstamp': 424140}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209324, 'error': None, 'target': 'ovnmeta-015f34bb-5da1-42eb-bab2-066f32a46dd5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:31:54 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:54.673 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap015f34bb-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:31:54 compute-0 nova_compute[187223]: 2025-11-28 17:31:54.675 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:31:54 compute-0 nova_compute[187223]: 2025-11-28 17:31:54.677 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:31:54 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:54.678 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap015f34bb-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:31:54 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:54.678 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 17:31:54 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:54.678 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap015f34bb-50, col_values=(('external_ids', {'iface-id': '2de820a4-0104-4404-a104-bd64f5ebe5e5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:31:54 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:54.679 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 17:31:54 compute-0 nova_compute[187223]: 2025-11-28 17:31:54.983 187227 DEBUG nova.compute.manager [req-b7cbd2c1-b2c3-4ce0-9fbf-01469c6af7fa req-93be1963-80cf-4d7f-a48c-59b560289cce 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] Received event network-vif-plugged-fd926b88-53a3-4c34-aaa1-2957370ba65a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:31:54 compute-0 nova_compute[187223]: 2025-11-28 17:31:54.984 187227 DEBUG oslo_concurrency.lockutils [req-b7cbd2c1-b2c3-4ce0-9fbf-01469c6af7fa req-93be1963-80cf-4d7f-a48c-59b560289cce 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "11258d07-82c4-4bf7-9965-9bd6fa9f6a10-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:31:54 compute-0 nova_compute[187223]: 2025-11-28 17:31:54.984 187227 DEBUG oslo_concurrency.lockutils [req-b7cbd2c1-b2c3-4ce0-9fbf-01469c6af7fa req-93be1963-80cf-4d7f-a48c-59b560289cce 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "11258d07-82c4-4bf7-9965-9bd6fa9f6a10-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:31:54 compute-0 nova_compute[187223]: 2025-11-28 17:31:54.984 187227 DEBUG oslo_concurrency.lockutils [req-b7cbd2c1-b2c3-4ce0-9fbf-01469c6af7fa req-93be1963-80cf-4d7f-a48c-59b560289cce 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "11258d07-82c4-4bf7-9965-9bd6fa9f6a10-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:31:54 compute-0 nova_compute[187223]: 2025-11-28 17:31:54.984 187227 DEBUG nova.compute.manager [req-b7cbd2c1-b2c3-4ce0-9fbf-01469c6af7fa req-93be1963-80cf-4d7f-a48c-59b560289cce 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] Processing event network-vif-plugged-fd926b88-53a3-4c34-aaa1-2957370ba65a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 28 17:31:55 compute-0 nova_compute[187223]: 2025-11-28 17:31:55.053 187227 DEBUG nova.compute.manager [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 28 17:31:55 compute-0 nova_compute[187223]: 2025-11-28 17:31:55.053 187227 DEBUG nova.virt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Emitting event <LifecycleEvent: 1764351115.0523791, 11258d07-82c4-4bf7-9965-9bd6fa9f6a10 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:31:55 compute-0 nova_compute[187223]: 2025-11-28 17:31:55.054 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] VM Started (Lifecycle Event)
Nov 28 17:31:55 compute-0 nova_compute[187223]: 2025-11-28 17:31:55.059 187227 DEBUG nova.virt.libvirt.driver [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 28 17:31:55 compute-0 nova_compute[187223]: 2025-11-28 17:31:55.065 187227 INFO nova.virt.libvirt.driver [-] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] Instance spawned successfully.
Nov 28 17:31:55 compute-0 nova_compute[187223]: 2025-11-28 17:31:55.065 187227 DEBUG nova.virt.libvirt.driver [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 28 17:31:55 compute-0 nova_compute[187223]: 2025-11-28 17:31:55.076 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:31:55 compute-0 nova_compute[187223]: 2025-11-28 17:31:55.080 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 28 17:31:55 compute-0 nova_compute[187223]: 2025-11-28 17:31:55.089 187227 DEBUG nova.virt.libvirt.driver [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:31:55 compute-0 nova_compute[187223]: 2025-11-28 17:31:55.090 187227 DEBUG nova.virt.libvirt.driver [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:31:55 compute-0 nova_compute[187223]: 2025-11-28 17:31:55.090 187227 DEBUG nova.virt.libvirt.driver [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:31:55 compute-0 nova_compute[187223]: 2025-11-28 17:31:55.091 187227 DEBUG nova.virt.libvirt.driver [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:31:55 compute-0 nova_compute[187223]: 2025-11-28 17:31:55.091 187227 DEBUG nova.virt.libvirt.driver [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:31:55 compute-0 nova_compute[187223]: 2025-11-28 17:31:55.092 187227 DEBUG nova.virt.libvirt.driver [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:31:55 compute-0 nova_compute[187223]: 2025-11-28 17:31:55.097 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 28 17:31:55 compute-0 nova_compute[187223]: 2025-11-28 17:31:55.098 187227 DEBUG nova.virt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Emitting event <LifecycleEvent: 1764351115.0556855, 11258d07-82c4-4bf7-9965-9bd6fa9f6a10 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:31:55 compute-0 nova_compute[187223]: 2025-11-28 17:31:55.098 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] VM Paused (Lifecycle Event)
Nov 28 17:31:55 compute-0 nova_compute[187223]: 2025-11-28 17:31:55.125 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:31:55 compute-0 nova_compute[187223]: 2025-11-28 17:31:55.130 187227 DEBUG nova.virt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Emitting event <LifecycleEvent: 1764351115.0582926, 11258d07-82c4-4bf7-9965-9bd6fa9f6a10 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:31:55 compute-0 nova_compute[187223]: 2025-11-28 17:31:55.131 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] VM Resumed (Lifecycle Event)
Nov 28 17:31:55 compute-0 nova_compute[187223]: 2025-11-28 17:31:55.148 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:31:55 compute-0 nova_compute[187223]: 2025-11-28 17:31:55.152 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 28 17:31:55 compute-0 nova_compute[187223]: 2025-11-28 17:31:55.156 187227 INFO nova.compute.manager [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] Took 7.14 seconds to spawn the instance on the hypervisor.
Nov 28 17:31:55 compute-0 nova_compute[187223]: 2025-11-28 17:31:55.157 187227 DEBUG nova.compute.manager [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:31:55 compute-0 nova_compute[187223]: 2025-11-28 17:31:55.185 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 28 17:31:55 compute-0 nova_compute[187223]: 2025-11-28 17:31:55.218 187227 INFO nova.compute.manager [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] Took 7.83 seconds to build instance.
Nov 28 17:31:55 compute-0 nova_compute[187223]: 2025-11-28 17:31:55.243 187227 DEBUG oslo_concurrency.lockutils [None req-e021a4d0-4144-4175-a851-5774baf4c63c f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Lock "11258d07-82c4-4bf7-9965-9bd6fa9f6a10" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.938s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:31:55 compute-0 nova_compute[187223]: 2025-11-28 17:31:55.377 187227 DEBUG nova.network.neutron [req-6383c906-277d-44a0-a957-1a7a8edb70cd req-414e4e5f-3c7c-47e9-a439-d1ae9ed26cf1 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] Updated VIF entry in instance network info cache for port fd926b88-53a3-4c34-aaa1-2957370ba65a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 28 17:31:55 compute-0 nova_compute[187223]: 2025-11-28 17:31:55.378 187227 DEBUG nova.network.neutron [req-6383c906-277d-44a0-a957-1a7a8edb70cd req-414e4e5f-3c7c-47e9-a439-d1ae9ed26cf1 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] Updating instance_info_cache with network_info: [{"id": "fd926b88-53a3-4c34-aaa1-2957370ba65a", "address": "fa:16:3e:4a:40:ab", "network": {"id": "015f34bb-5da1-42eb-bab2-066f32a46dd5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-213410289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ea47683b97094cc99b882a5a1b90949f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd926b88-53", "ovs_interfaceid": "fd926b88-53a3-4c34-aaa1-2957370ba65a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:31:55 compute-0 nova_compute[187223]: 2025-11-28 17:31:55.392 187227 DEBUG oslo_concurrency.lockutils [req-6383c906-277d-44a0-a957-1a7a8edb70cd req-414e4e5f-3c7c-47e9-a439-d1ae9ed26cf1 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Releasing lock "refresh_cache-11258d07-82c4-4bf7-9965-9bd6fa9f6a10" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 17:31:56 compute-0 podman[209332]: 2025-11-28 17:31:56.244177352 +0000 UTC m=+0.080828423 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., version=9.6, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, release=1755695350, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public)
Nov 28 17:31:57 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:31:57.024 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2ad2dbac-a967-40fb-b69b-7c374c5f8e9d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:31:57 compute-0 nova_compute[187223]: 2025-11-28 17:31:57.155 187227 DEBUG nova.compute.manager [req-ef2526f0-c23e-4953-b0df-f4076d73ec3c req-bae47f3b-8a1a-413f-b082-7052d4cf9759 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] Received event network-vif-plugged-fd926b88-53a3-4c34-aaa1-2957370ba65a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:31:57 compute-0 nova_compute[187223]: 2025-11-28 17:31:57.156 187227 DEBUG oslo_concurrency.lockutils [req-ef2526f0-c23e-4953-b0df-f4076d73ec3c req-bae47f3b-8a1a-413f-b082-7052d4cf9759 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "11258d07-82c4-4bf7-9965-9bd6fa9f6a10-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:31:57 compute-0 nova_compute[187223]: 2025-11-28 17:31:57.156 187227 DEBUG oslo_concurrency.lockutils [req-ef2526f0-c23e-4953-b0df-f4076d73ec3c req-bae47f3b-8a1a-413f-b082-7052d4cf9759 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "11258d07-82c4-4bf7-9965-9bd6fa9f6a10-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:31:57 compute-0 nova_compute[187223]: 2025-11-28 17:31:57.156 187227 DEBUG oslo_concurrency.lockutils [req-ef2526f0-c23e-4953-b0df-f4076d73ec3c req-bae47f3b-8a1a-413f-b082-7052d4cf9759 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "11258d07-82c4-4bf7-9965-9bd6fa9f6a10-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:31:57 compute-0 nova_compute[187223]: 2025-11-28 17:31:57.157 187227 DEBUG nova.compute.manager [req-ef2526f0-c23e-4953-b0df-f4076d73ec3c req-bae47f3b-8a1a-413f-b082-7052d4cf9759 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] No waiting events found dispatching network-vif-plugged-fd926b88-53a3-4c34-aaa1-2957370ba65a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:31:57 compute-0 nova_compute[187223]: 2025-11-28 17:31:57.157 187227 WARNING nova.compute.manager [req-ef2526f0-c23e-4953-b0df-f4076d73ec3c req-bae47f3b-8a1a-413f-b082-7052d4cf9759 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] Received unexpected event network-vif-plugged-fd926b88-53a3-4c34-aaa1-2957370ba65a for instance with vm_state active and task_state None.
Nov 28 17:31:58 compute-0 nova_compute[187223]: 2025-11-28 17:31:58.004 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:31:58 compute-0 nova_compute[187223]: 2025-11-28 17:31:58.350 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:31:59 compute-0 podman[197556]: time="2025-11-28T17:31:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:31:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:31:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18323 "" "Go-http-client/1.1"
Nov 28 17:31:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:31:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3049 "" "Go-http-client/1.1"
Nov 28 17:32:01 compute-0 openstack_network_exporter[199717]: ERROR   17:32:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:32:01 compute-0 openstack_network_exporter[199717]: ERROR   17:32:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:32:01 compute-0 openstack_network_exporter[199717]: ERROR   17:32:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:32:01 compute-0 openstack_network_exporter[199717]: ERROR   17:32:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:32:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:32:01 compute-0 openstack_network_exporter[199717]: ERROR   17:32:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:32:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:32:03 compute-0 nova_compute[187223]: 2025-11-28 17:32:03.051 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:32:03 compute-0 nova_compute[187223]: 2025-11-28 17:32:03.353 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:32:07 compute-0 ovn_controller[95574]: 2025-11-28T17:32:07Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:4a:40:ab 10.100.0.14
Nov 28 17:32:07 compute-0 ovn_controller[95574]: 2025-11-28T17:32:07Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:4a:40:ab 10.100.0.14
Nov 28 17:32:08 compute-0 nova_compute[187223]: 2025-11-28 17:32:08.050 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:32:08 compute-0 nova_compute[187223]: 2025-11-28 17:32:08.355 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:32:11 compute-0 podman[209365]: 2025-11-28 17:32:11.268870746 +0000 UTC m=+0.093017971 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 17:32:13 compute-0 nova_compute[187223]: 2025-11-28 17:32:13.054 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:32:13 compute-0 nova_compute[187223]: 2025-11-28 17:32:13.358 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:32:17 compute-0 nova_compute[187223]: 2025-11-28 17:32:17.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:32:17 compute-0 nova_compute[187223]: 2025-11-28 17:32:17.684 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 28 17:32:17 compute-0 nova_compute[187223]: 2025-11-28 17:32:17.871 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 28 17:32:18 compute-0 nova_compute[187223]: 2025-11-28 17:32:18.057 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:32:18 compute-0 nova_compute[187223]: 2025-11-28 17:32:18.360 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:32:20 compute-0 podman[209403]: 2025-11-28 17:32:20.210846377 +0000 UTC m=+0.061500556 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Nov 28 17:32:21 compute-0 nova_compute[187223]: 2025-11-28 17:32:21.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:32:22 compute-0 nova_compute[187223]: 2025-11-28 17:32:22.710 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:32:22 compute-0 nova_compute[187223]: 2025-11-28 17:32:22.711 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 28 17:32:23 compute-0 nova_compute[187223]: 2025-11-28 17:32:23.061 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:32:23 compute-0 nova_compute[187223]: 2025-11-28 17:32:23.362 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:32:23 compute-0 nova_compute[187223]: 2025-11-28 17:32:23.710 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:32:24 compute-0 podman[209422]: 2025-11-28 17:32:24.236210528 +0000 UTC m=+0.084166941 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, managed_by=edpm_ansible, container_name=multipathd)
Nov 28 17:32:24 compute-0 podman[209423]: 2025-11-28 17:32:24.26490361 +0000 UTC m=+0.118075846 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 17:32:24 compute-0 nova_compute[187223]: 2025-11-28 17:32:24.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:32:24 compute-0 nova_compute[187223]: 2025-11-28 17:32:24.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:32:25 compute-0 nova_compute[187223]: 2025-11-28 17:32:25.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:32:25 compute-0 nova_compute[187223]: 2025-11-28 17:32:25.685 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:32:25 compute-0 nova_compute[187223]: 2025-11-28 17:32:25.685 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 17:32:26 compute-0 nova_compute[187223]: 2025-11-28 17:32:26.680 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:32:26 compute-0 nova_compute[187223]: 2025-11-28 17:32:26.704 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:32:26 compute-0 nova_compute[187223]: 2025-11-28 17:32:26.727 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:32:26 compute-0 nova_compute[187223]: 2025-11-28 17:32:26.728 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:32:26 compute-0 nova_compute[187223]: 2025-11-28 17:32:26.728 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:32:26 compute-0 nova_compute[187223]: 2025-11-28 17:32:26.728 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 17:32:26 compute-0 nova_compute[187223]: 2025-11-28 17:32:26.811 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8ce9d497-2a8a-4c42-b93f-5778740cbc9b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:32:26 compute-0 podman[209472]: 2025-11-28 17:32:26.845286123 +0000 UTC m=+0.064950607 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, config_id=edpm, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 28 17:32:26 compute-0 nova_compute[187223]: 2025-11-28 17:32:26.870 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8ce9d497-2a8a-4c42-b93f-5778740cbc9b/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:32:26 compute-0 nova_compute[187223]: 2025-11-28 17:32:26.871 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8ce9d497-2a8a-4c42-b93f-5778740cbc9b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:32:26 compute-0 nova_compute[187223]: 2025-11-28 17:32:26.924 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8ce9d497-2a8a-4c42-b93f-5778740cbc9b/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:32:26 compute-0 nova_compute[187223]: 2025-11-28 17:32:26.931 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ea967bd2-166d-4969-ad81-03f2528ed4f5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:32:26 compute-0 nova_compute[187223]: 2025-11-28 17:32:26.986 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ea967bd2-166d-4969-ad81-03f2528ed4f5/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:32:26 compute-0 nova_compute[187223]: 2025-11-28 17:32:26.987 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ea967bd2-166d-4969-ad81-03f2528ed4f5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:32:27 compute-0 nova_compute[187223]: 2025-11-28 17:32:27.058 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ea967bd2-166d-4969-ad81-03f2528ed4f5/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:32:27 compute-0 nova_compute[187223]: 2025-11-28 17:32:27.064 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/11258d07-82c4-4bf7-9965-9bd6fa9f6a10/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:32:27 compute-0 nova_compute[187223]: 2025-11-28 17:32:27.123 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/11258d07-82c4-4bf7-9965-9bd6fa9f6a10/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:32:27 compute-0 nova_compute[187223]: 2025-11-28 17:32:27.125 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/11258d07-82c4-4bf7-9965-9bd6fa9f6a10/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:32:27 compute-0 nova_compute[187223]: 2025-11-28 17:32:27.179 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/11258d07-82c4-4bf7-9965-9bd6fa9f6a10/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:32:27 compute-0 nova_compute[187223]: 2025-11-28 17:32:27.348 187227 WARNING nova.virt.libvirt.driver [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 17:32:27 compute-0 nova_compute[187223]: 2025-11-28 17:32:27.349 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5385MB free_disk=73.2585678100586GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 17:32:27 compute-0 nova_compute[187223]: 2025-11-28 17:32:27.349 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:32:27 compute-0 nova_compute[187223]: 2025-11-28 17:32:27.350 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:32:27 compute-0 nova_compute[187223]: 2025-11-28 17:32:27.512 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Instance ea967bd2-166d-4969-ad81-03f2528ed4f5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 17:32:27 compute-0 nova_compute[187223]: 2025-11-28 17:32:27.513 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Instance 8ce9d497-2a8a-4c42-b93f-5778740cbc9b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 17:32:27 compute-0 nova_compute[187223]: 2025-11-28 17:32:27.513 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Instance 11258d07-82c4-4bf7-9965-9bd6fa9f6a10 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 17:32:27 compute-0 nova_compute[187223]: 2025-11-28 17:32:27.513 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 17:32:27 compute-0 nova_compute[187223]: 2025-11-28 17:32:27.513 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=960MB phys_disk=79GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 17:32:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:32:27.678 104433 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:32:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:32:27.680 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:32:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:32:27.681 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:32:27 compute-0 nova_compute[187223]: 2025-11-28 17:32:27.823 187227 DEBUG nova.compute.provider_tree [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 17:32:27 compute-0 nova_compute[187223]: 2025-11-28 17:32:27.848 187227 DEBUG nova.scheduler.client.report [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 17:32:27 compute-0 nova_compute[187223]: 2025-11-28 17:32:27.875 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 17:32:27 compute-0 nova_compute[187223]: 2025-11-28 17:32:27.876 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.526s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:32:28 compute-0 nova_compute[187223]: 2025-11-28 17:32:28.064 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:32:28 compute-0 nova_compute[187223]: 2025-11-28 17:32:28.364 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:32:29 compute-0 podman[197556]: time="2025-11-28T17:32:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:32:29 compute-0 podman[197556]: @ - - [28/Nov/2025:17:32:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18323 "" "Go-http-client/1.1"
Nov 28 17:32:29 compute-0 podman[197556]: @ - - [28/Nov/2025:17:32:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3052 "" "Go-http-client/1.1"
Nov 28 17:32:29 compute-0 nova_compute[187223]: 2025-11-28 17:32:29.855 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:32:29 compute-0 nova_compute[187223]: 2025-11-28 17:32:29.856 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 17:32:30 compute-0 nova_compute[187223]: 2025-11-28 17:32:30.188 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "refresh_cache-ea967bd2-166d-4969-ad81-03f2528ed4f5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 17:32:30 compute-0 nova_compute[187223]: 2025-11-28 17:32:30.188 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquired lock "refresh_cache-ea967bd2-166d-4969-ad81-03f2528ed4f5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 17:32:30 compute-0 nova_compute[187223]: 2025-11-28 17:32:30.188 187227 DEBUG nova.network.neutron [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 28 17:32:31 compute-0 openstack_network_exporter[199717]: ERROR   17:32:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:32:31 compute-0 openstack_network_exporter[199717]: ERROR   17:32:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:32:31 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:32:31 compute-0 openstack_network_exporter[199717]: ERROR   17:32:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:32:31 compute-0 openstack_network_exporter[199717]: ERROR   17:32:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:32:31 compute-0 openstack_network_exporter[199717]: ERROR   17:32:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:32:31 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:32:32 compute-0 nova_compute[187223]: 2025-11-28 17:32:32.130 187227 DEBUG nova.network.neutron [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] Updating instance_info_cache with network_info: [{"id": "5d57189a-27f6-43f1-8c3f-e6c6389babcd", "address": "fa:16:3e:9d:bf:e7", "network": {"id": "015f34bb-5da1-42eb-bab2-066f32a46dd5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-213410289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ea47683b97094cc99b882a5a1b90949f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d57189a-27", "ovs_interfaceid": "5d57189a-27f6-43f1-8c3f-e6c6389babcd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:32:32 compute-0 nova_compute[187223]: 2025-11-28 17:32:32.163 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Releasing lock "refresh_cache-ea967bd2-166d-4969-ad81-03f2528ed4f5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 17:32:32 compute-0 nova_compute[187223]: 2025-11-28 17:32:32.164 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 28 17:32:32 compute-0 nova_compute[187223]: 2025-11-28 17:32:32.165 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:32:33 compute-0 nova_compute[187223]: 2025-11-28 17:32:33.067 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:32:33 compute-0 nova_compute[187223]: 2025-11-28 17:32:33.375 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:32:33 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:32:33.476 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3e:3c:af', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '0e:e0:60:f9:5f:25'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 17:32:33 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:32:33.478 104433 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 17:32:33 compute-0 nova_compute[187223]: 2025-11-28 17:32:33.478 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:32:33 compute-0 nova_compute[187223]: 2025-11-28 17:32:33.988 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:32:38 compute-0 nova_compute[187223]: 2025-11-28 17:32:38.070 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:32:38 compute-0 nova_compute[187223]: 2025-11-28 17:32:38.377 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:32:40 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:32:40.482 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2ad2dbac-a967-40fb-b69b-7c374c5f8e9d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:32:42 compute-0 podman[209524]: 2025-11-28 17:32:42.251492063 +0000 UTC m=+0.095830813 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 17:32:43 compute-0 nova_compute[187223]: 2025-11-28 17:32:43.074 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:32:43 compute-0 nova_compute[187223]: 2025-11-28 17:32:43.421 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:32:45 compute-0 nova_compute[187223]: 2025-11-28 17:32:45.475 187227 DEBUG nova.virt.libvirt.driver [None req-61cf8680-8fcb-4fe7-8227-ef005cd4d8e4 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: cad1bde5-a40e-4d4b-a51f-1930b1a66cb6] Creating tmpfile /var/lib/nova/instances/tmpudfs435k to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Nov 28 17:32:45 compute-0 nova_compute[187223]: 2025-11-28 17:32:45.698 187227 DEBUG nova.compute.manager [None req-61cf8680-8fcb-4fe7-8227-ef005cd4d8e4 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=71680,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpudfs435k',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Nov 28 17:32:46 compute-0 nova_compute[187223]: 2025-11-28 17:32:46.779 187227 DEBUG nova.compute.manager [None req-61cf8680-8fcb-4fe7-8227-ef005cd4d8e4 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=71680,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpudfs435k',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='cad1bde5-a40e-4d4b-a51f-1930b1a66cb6',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Nov 28 17:32:46 compute-0 nova_compute[187223]: 2025-11-28 17:32:46.820 187227 DEBUG oslo_concurrency.lockutils [None req-61cf8680-8fcb-4fe7-8227-ef005cd4d8e4 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "refresh_cache-cad1bde5-a40e-4d4b-a51f-1930b1a66cb6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 17:32:46 compute-0 nova_compute[187223]: 2025-11-28 17:32:46.820 187227 DEBUG oslo_concurrency.lockutils [None req-61cf8680-8fcb-4fe7-8227-ef005cd4d8e4 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquired lock "refresh_cache-cad1bde5-a40e-4d4b-a51f-1930b1a66cb6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 17:32:46 compute-0 nova_compute[187223]: 2025-11-28 17:32:46.821 187227 DEBUG nova.network.neutron [None req-61cf8680-8fcb-4fe7-8227-ef005cd4d8e4 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: cad1bde5-a40e-4d4b-a51f-1930b1a66cb6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 28 17:32:48 compute-0 nova_compute[187223]: 2025-11-28 17:32:48.076 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:32:48 compute-0 nova_compute[187223]: 2025-11-28 17:32:48.423 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:32:48 compute-0 nova_compute[187223]: 2025-11-28 17:32:48.678 187227 DEBUG nova.network.neutron [None req-61cf8680-8fcb-4fe7-8227-ef005cd4d8e4 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: cad1bde5-a40e-4d4b-a51f-1930b1a66cb6] Updating instance_info_cache with network_info: [{"id": "129faec9-e6b1-488e-b6bb-5b2e828e626f", "address": "fa:16:3e:77:63:1c", "network": {"id": "015f34bb-5da1-42eb-bab2-066f32a46dd5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-213410289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ea47683b97094cc99b882a5a1b90949f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap129faec9-e6", "ovs_interfaceid": "129faec9-e6b1-488e-b6bb-5b2e828e626f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:32:48 compute-0 nova_compute[187223]: 2025-11-28 17:32:48.707 187227 DEBUG oslo_concurrency.lockutils [None req-61cf8680-8fcb-4fe7-8227-ef005cd4d8e4 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Releasing lock "refresh_cache-cad1bde5-a40e-4d4b-a51f-1930b1a66cb6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 17:32:48 compute-0 nova_compute[187223]: 2025-11-28 17:32:48.709 187227 DEBUG nova.virt.libvirt.driver [None req-61cf8680-8fcb-4fe7-8227-ef005cd4d8e4 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: cad1bde5-a40e-4d4b-a51f-1930b1a66cb6] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=71680,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpudfs435k',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='cad1bde5-a40e-4d4b-a51f-1930b1a66cb6',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Nov 28 17:32:48 compute-0 nova_compute[187223]: 2025-11-28 17:32:48.710 187227 DEBUG nova.virt.libvirt.driver [None req-61cf8680-8fcb-4fe7-8227-ef005cd4d8e4 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: cad1bde5-a40e-4d4b-a51f-1930b1a66cb6] Creating instance directory: /var/lib/nova/instances/cad1bde5-a40e-4d4b-a51f-1930b1a66cb6 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Nov 28 17:32:48 compute-0 nova_compute[187223]: 2025-11-28 17:32:48.710 187227 DEBUG nova.virt.libvirt.driver [None req-61cf8680-8fcb-4fe7-8227-ef005cd4d8e4 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: cad1bde5-a40e-4d4b-a51f-1930b1a66cb6] Creating disk.info with the contents: {'/var/lib/nova/instances/cad1bde5-a40e-4d4b-a51f-1930b1a66cb6/disk': 'qcow2', '/var/lib/nova/instances/cad1bde5-a40e-4d4b-a51f-1930b1a66cb6/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Nov 28 17:32:48 compute-0 nova_compute[187223]: 2025-11-28 17:32:48.711 187227 DEBUG nova.virt.libvirt.driver [None req-61cf8680-8fcb-4fe7-8227-ef005cd4d8e4 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: cad1bde5-a40e-4d4b-a51f-1930b1a66cb6] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Nov 28 17:32:48 compute-0 nova_compute[187223]: 2025-11-28 17:32:48.712 187227 DEBUG nova.objects.instance [None req-61cf8680-8fcb-4fe7-8227-ef005cd4d8e4 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lazy-loading 'trusted_certs' on Instance uuid cad1bde5-a40e-4d4b-a51f-1930b1a66cb6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:32:48 compute-0 nova_compute[187223]: 2025-11-28 17:32:48.743 187227 DEBUG oslo_concurrency.processutils [None req-61cf8680-8fcb-4fe7-8227-ef005cd4d8e4 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:32:48 compute-0 nova_compute[187223]: 2025-11-28 17:32:48.844 187227 DEBUG oslo_concurrency.processutils [None req-61cf8680-8fcb-4fe7-8227-ef005cd4d8e4 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json" returned: 0 in 0.101s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:32:48 compute-0 nova_compute[187223]: 2025-11-28 17:32:48.845 187227 DEBUG oslo_concurrency.lockutils [None req-61cf8680-8fcb-4fe7-8227-ef005cd4d8e4 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "bd6565e0cc32420aac21dd3de8842bc53bb13eba" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:32:48 compute-0 nova_compute[187223]: 2025-11-28 17:32:48.846 187227 DEBUG oslo_concurrency.lockutils [None req-61cf8680-8fcb-4fe7-8227-ef005cd4d8e4 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "bd6565e0cc32420aac21dd3de8842bc53bb13eba" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:32:48 compute-0 nova_compute[187223]: 2025-11-28 17:32:48.860 187227 DEBUG oslo_concurrency.processutils [None req-61cf8680-8fcb-4fe7-8227-ef005cd4d8e4 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:32:48 compute-0 nova_compute[187223]: 2025-11-28 17:32:48.960 187227 DEBUG oslo_concurrency.processutils [None req-61cf8680-8fcb-4fe7-8227-ef005cd4d8e4 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json" returned: 0 in 0.100s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:32:48 compute-0 nova_compute[187223]: 2025-11-28 17:32:48.961 187227 DEBUG oslo_concurrency.processutils [None req-61cf8680-8fcb-4fe7-8227-ef005cd4d8e4 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba,backing_fmt=raw /var/lib/nova/instances/cad1bde5-a40e-4d4b-a51f-1930b1a66cb6/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:32:49 compute-0 nova_compute[187223]: 2025-11-28 17:32:49.009 187227 DEBUG oslo_concurrency.processutils [None req-61cf8680-8fcb-4fe7-8227-ef005cd4d8e4 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba,backing_fmt=raw /var/lib/nova/instances/cad1bde5-a40e-4d4b-a51f-1930b1a66cb6/disk 1073741824" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:32:49 compute-0 nova_compute[187223]: 2025-11-28 17:32:49.010 187227 DEBUG oslo_concurrency.lockutils [None req-61cf8680-8fcb-4fe7-8227-ef005cd4d8e4 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "bd6565e0cc32420aac21dd3de8842bc53bb13eba" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.165s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:32:49 compute-0 nova_compute[187223]: 2025-11-28 17:32:49.011 187227 DEBUG oslo_concurrency.processutils [None req-61cf8680-8fcb-4fe7-8227-ef005cd4d8e4 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:32:49 compute-0 nova_compute[187223]: 2025-11-28 17:32:49.071 187227 DEBUG oslo_concurrency.processutils [None req-61cf8680-8fcb-4fe7-8227-ef005cd4d8e4 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:32:49 compute-0 nova_compute[187223]: 2025-11-28 17:32:49.073 187227 DEBUG nova.virt.disk.api [None req-61cf8680-8fcb-4fe7-8227-ef005cd4d8e4 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Checking if we can resize image /var/lib/nova/instances/cad1bde5-a40e-4d4b-a51f-1930b1a66cb6/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 28 17:32:49 compute-0 nova_compute[187223]: 2025-11-28 17:32:49.074 187227 DEBUG oslo_concurrency.processutils [None req-61cf8680-8fcb-4fe7-8227-ef005cd4d8e4 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cad1bde5-a40e-4d4b-a51f-1930b1a66cb6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:32:49 compute-0 nova_compute[187223]: 2025-11-28 17:32:49.157 187227 DEBUG oslo_concurrency.processutils [None req-61cf8680-8fcb-4fe7-8227-ef005cd4d8e4 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cad1bde5-a40e-4d4b-a51f-1930b1a66cb6/disk --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:32:49 compute-0 nova_compute[187223]: 2025-11-28 17:32:49.159 187227 DEBUG nova.virt.disk.api [None req-61cf8680-8fcb-4fe7-8227-ef005cd4d8e4 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Cannot resize image /var/lib/nova/instances/cad1bde5-a40e-4d4b-a51f-1930b1a66cb6/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 28 17:32:49 compute-0 nova_compute[187223]: 2025-11-28 17:32:49.160 187227 DEBUG nova.objects.instance [None req-61cf8680-8fcb-4fe7-8227-ef005cd4d8e4 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lazy-loading 'migration_context' on Instance uuid cad1bde5-a40e-4d4b-a51f-1930b1a66cb6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:32:49 compute-0 nova_compute[187223]: 2025-11-28 17:32:49.180 187227 DEBUG oslo_concurrency.processutils [None req-61cf8680-8fcb-4fe7-8227-ef005cd4d8e4 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/cad1bde5-a40e-4d4b-a51f-1930b1a66cb6/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:32:49 compute-0 nova_compute[187223]: 2025-11-28 17:32:49.212 187227 DEBUG oslo_concurrency.processutils [None req-61cf8680-8fcb-4fe7-8227-ef005cd4d8e4 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/cad1bde5-a40e-4d4b-a51f-1930b1a66cb6/disk.config 485376" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:32:49 compute-0 nova_compute[187223]: 2025-11-28 17:32:49.216 187227 DEBUG nova.virt.libvirt.volume.remotefs [None req-61cf8680-8fcb-4fe7-8227-ef005cd4d8e4 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Copying file compute-1.ctlplane.example.com:/var/lib/nova/instances/cad1bde5-a40e-4d4b-a51f-1930b1a66cb6/disk.config to /var/lib/nova/instances/cad1bde5-a40e-4d4b-a51f-1930b1a66cb6 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Nov 28 17:32:49 compute-0 nova_compute[187223]: 2025-11-28 17:32:49.217 187227 DEBUG oslo_concurrency.processutils [None req-61cf8680-8fcb-4fe7-8227-ef005cd4d8e4 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Running cmd (subprocess): scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/cad1bde5-a40e-4d4b-a51f-1930b1a66cb6/disk.config /var/lib/nova/instances/cad1bde5-a40e-4d4b-a51f-1930b1a66cb6 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:32:49 compute-0 nova_compute[187223]: 2025-11-28 17:32:49.799 187227 DEBUG oslo_concurrency.processutils [None req-61cf8680-8fcb-4fe7-8227-ef005cd4d8e4 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] CMD "scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/cad1bde5-a40e-4d4b-a51f-1930b1a66cb6/disk.config /var/lib/nova/instances/cad1bde5-a40e-4d4b-a51f-1930b1a66cb6" returned: 0 in 0.582s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:32:49 compute-0 nova_compute[187223]: 2025-11-28 17:32:49.800 187227 DEBUG nova.virt.libvirt.driver [None req-61cf8680-8fcb-4fe7-8227-ef005cd4d8e4 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: cad1bde5-a40e-4d4b-a51f-1930b1a66cb6] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Nov 28 17:32:49 compute-0 nova_compute[187223]: 2025-11-28 17:32:49.802 187227 DEBUG nova.virt.libvirt.vif [None req-61cf8680-8fcb-4fe7-8227-ef005cd4d8e4 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T17:31:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-375892467',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-375892467',id=3,image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-28T17:31:42Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='ea47683b97094cc99b882a5a1b90949f',ramdisk_id='',reservation_id='r-jduii4z4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-544878882',owner_user_name='tempest-TestExecuteActionsViaActuator-544878882-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T17:31:42Z,user_data=None,user_id='f7ca965410e74fcabced6e50aab5d096',uuid=cad1bde5-a40e-4d4b-a51f-1930b1a66cb6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "129faec9-e6b1-488e-b6bb-5b2e828e626f", "address": "fa:16:3e:77:63:1c", "network": {"id": "015f34bb-5da1-42eb-bab2-066f32a46dd5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-213410289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ea47683b97094cc99b882a5a1b90949f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap129faec9-e6", "ovs_interfaceid": "129faec9-e6b1-488e-b6bb-5b2e828e626f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 28 17:32:49 compute-0 nova_compute[187223]: 2025-11-28 17:32:49.803 187227 DEBUG nova.network.os_vif_util [None req-61cf8680-8fcb-4fe7-8227-ef005cd4d8e4 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Converting VIF {"id": "129faec9-e6b1-488e-b6bb-5b2e828e626f", "address": "fa:16:3e:77:63:1c", "network": {"id": "015f34bb-5da1-42eb-bab2-066f32a46dd5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-213410289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ea47683b97094cc99b882a5a1b90949f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap129faec9-e6", "ovs_interfaceid": "129faec9-e6b1-488e-b6bb-5b2e828e626f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 17:32:49 compute-0 nova_compute[187223]: 2025-11-28 17:32:49.804 187227 DEBUG nova.network.os_vif_util [None req-61cf8680-8fcb-4fe7-8227-ef005cd4d8e4 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:77:63:1c,bridge_name='br-int',has_traffic_filtering=True,id=129faec9-e6b1-488e-b6bb-5b2e828e626f,network=Network(015f34bb-5da1-42eb-bab2-066f32a46dd5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap129faec9-e6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 17:32:49 compute-0 nova_compute[187223]: 2025-11-28 17:32:49.805 187227 DEBUG os_vif [None req-61cf8680-8fcb-4fe7-8227-ef005cd4d8e4 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:77:63:1c,bridge_name='br-int',has_traffic_filtering=True,id=129faec9-e6b1-488e-b6bb-5b2e828e626f,network=Network(015f34bb-5da1-42eb-bab2-066f32a46dd5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap129faec9-e6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 28 17:32:49 compute-0 nova_compute[187223]: 2025-11-28 17:32:49.806 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:32:49 compute-0 nova_compute[187223]: 2025-11-28 17:32:49.806 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:32:49 compute-0 nova_compute[187223]: 2025-11-28 17:32:49.807 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 17:32:49 compute-0 nova_compute[187223]: 2025-11-28 17:32:49.810 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:32:49 compute-0 nova_compute[187223]: 2025-11-28 17:32:49.810 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap129faec9-e6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:32:49 compute-0 nova_compute[187223]: 2025-11-28 17:32:49.811 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap129faec9-e6, col_values=(('external_ids', {'iface-id': '129faec9-e6b1-488e-b6bb-5b2e828e626f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:77:63:1c', 'vm-uuid': 'cad1bde5-a40e-4d4b-a51f-1930b1a66cb6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:32:49 compute-0 nova_compute[187223]: 2025-11-28 17:32:49.813 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:32:49 compute-0 NetworkManager[55763]: <info>  [1764351169.8161] manager: (tap129faec9-e6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/28)
Nov 28 17:32:49 compute-0 nova_compute[187223]: 2025-11-28 17:32:49.817 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 17:32:49 compute-0 nova_compute[187223]: 2025-11-28 17:32:49.823 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:32:49 compute-0 nova_compute[187223]: 2025-11-28 17:32:49.824 187227 INFO os_vif [None req-61cf8680-8fcb-4fe7-8227-ef005cd4d8e4 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:77:63:1c,bridge_name='br-int',has_traffic_filtering=True,id=129faec9-e6b1-488e-b6bb-5b2e828e626f,network=Network(015f34bb-5da1-42eb-bab2-066f32a46dd5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap129faec9-e6')
Nov 28 17:32:49 compute-0 nova_compute[187223]: 2025-11-28 17:32:49.825 187227 DEBUG nova.virt.libvirt.driver [None req-61cf8680-8fcb-4fe7-8227-ef005cd4d8e4 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Nov 28 17:32:49 compute-0 nova_compute[187223]: 2025-11-28 17:32:49.825 187227 DEBUG nova.compute.manager [None req-61cf8680-8fcb-4fe7-8227-ef005cd4d8e4 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=71680,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpudfs435k',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='cad1bde5-a40e-4d4b-a51f-1930b1a66cb6',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Nov 28 17:32:50 compute-0 nova_compute[187223]: 2025-11-28 17:32:50.480 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:32:50 compute-0 nova_compute[187223]: 2025-11-28 17:32:50.510 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Triggering sync for uuid 8ce9d497-2a8a-4c42-b93f-5778740cbc9b _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Nov 28 17:32:50 compute-0 nova_compute[187223]: 2025-11-28 17:32:50.511 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Triggering sync for uuid ea967bd2-166d-4969-ad81-03f2528ed4f5 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Nov 28 17:32:50 compute-0 nova_compute[187223]: 2025-11-28 17:32:50.511 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Triggering sync for uuid 11258d07-82c4-4bf7-9965-9bd6fa9f6a10 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Nov 28 17:32:50 compute-0 nova_compute[187223]: 2025-11-28 17:32:50.512 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "8ce9d497-2a8a-4c42-b93f-5778740cbc9b" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:32:50 compute-0 nova_compute[187223]: 2025-11-28 17:32:50.512 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "8ce9d497-2a8a-4c42-b93f-5778740cbc9b" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:32:50 compute-0 nova_compute[187223]: 2025-11-28 17:32:50.512 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "ea967bd2-166d-4969-ad81-03f2528ed4f5" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:32:50 compute-0 nova_compute[187223]: 2025-11-28 17:32:50.512 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "ea967bd2-166d-4969-ad81-03f2528ed4f5" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:32:50 compute-0 nova_compute[187223]: 2025-11-28 17:32:50.513 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "11258d07-82c4-4bf7-9965-9bd6fa9f6a10" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:32:50 compute-0 nova_compute[187223]: 2025-11-28 17:32:50.513 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "11258d07-82c4-4bf7-9965-9bd6fa9f6a10" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:32:50 compute-0 nova_compute[187223]: 2025-11-28 17:32:50.603 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "8ce9d497-2a8a-4c42-b93f-5778740cbc9b" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.091s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:32:50 compute-0 nova_compute[187223]: 2025-11-28 17:32:50.603 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "ea967bd2-166d-4969-ad81-03f2528ed4f5" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.091s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:32:50 compute-0 nova_compute[187223]: 2025-11-28 17:32:50.632 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "11258d07-82c4-4bf7-9965-9bd6fa9f6a10" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.119s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:32:51 compute-0 podman[209570]: 2025-11-28 17:32:51.205841752 +0000 UTC m=+0.064476897 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 17:32:51 compute-0 nova_compute[187223]: 2025-11-28 17:32:51.552 187227 DEBUG nova.network.neutron [None req-61cf8680-8fcb-4fe7-8227-ef005cd4d8e4 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: cad1bde5-a40e-4d4b-a51f-1930b1a66cb6] Port 129faec9-e6b1-488e-b6bb-5b2e828e626f updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Nov 28 17:32:51 compute-0 nova_compute[187223]: 2025-11-28 17:32:51.554 187227 DEBUG nova.compute.manager [None req-61cf8680-8fcb-4fe7-8227-ef005cd4d8e4 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=71680,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpudfs435k',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='cad1bde5-a40e-4d4b-a51f-1930b1a66cb6',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Nov 28 17:32:51 compute-0 systemd[1]: Starting libvirt proxy daemon...
Nov 28 17:32:51 compute-0 systemd[1]: Started libvirt proxy daemon.
Nov 28 17:32:51 compute-0 kernel: tap129faec9-e6: entered promiscuous mode
Nov 28 17:32:51 compute-0 NetworkManager[55763]: <info>  [1764351171.9290] manager: (tap129faec9-e6): new Tun device (/org/freedesktop/NetworkManager/Devices/29)
Nov 28 17:32:51 compute-0 nova_compute[187223]: 2025-11-28 17:32:51.930 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:32:51 compute-0 ovn_controller[95574]: 2025-11-28T17:32:51Z|00040|binding|INFO|Claiming lport 129faec9-e6b1-488e-b6bb-5b2e828e626f for this additional chassis.
Nov 28 17:32:51 compute-0 ovn_controller[95574]: 2025-11-28T17:32:51Z|00041|binding|INFO|129faec9-e6b1-488e-b6bb-5b2e828e626f: Claiming fa:16:3e:77:63:1c 10.100.0.5
Nov 28 17:32:51 compute-0 ovn_controller[95574]: 2025-11-28T17:32:51Z|00042|binding|INFO|Setting lport 129faec9-e6b1-488e-b6bb-5b2e828e626f ovn-installed in OVS
Nov 28 17:32:51 compute-0 nova_compute[187223]: 2025-11-28 17:32:51.953 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:32:51 compute-0 nova_compute[187223]: 2025-11-28 17:32:51.957 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:32:51 compute-0 systemd-udevd[209620]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 17:32:51 compute-0 NetworkManager[55763]: <info>  [1764351171.9820] device (tap129faec9-e6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 28 17:32:51 compute-0 NetworkManager[55763]: <info>  [1764351171.9837] device (tap129faec9-e6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 28 17:32:52 compute-0 systemd-machined[153517]: New machine qemu-4-instance-00000003.
Nov 28 17:32:52 compute-0 systemd[1]: Started Virtual Machine qemu-4-instance-00000003.
Nov 28 17:32:52 compute-0 nova_compute[187223]: 2025-11-28 17:32:52.687 187227 DEBUG nova.virt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Emitting event <LifecycleEvent: 1764351172.6860726, cad1bde5-a40e-4d4b-a51f-1930b1a66cb6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:32:52 compute-0 nova_compute[187223]: 2025-11-28 17:32:52.688 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: cad1bde5-a40e-4d4b-a51f-1930b1a66cb6] VM Started (Lifecycle Event)
Nov 28 17:32:52 compute-0 nova_compute[187223]: 2025-11-28 17:32:52.719 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: cad1bde5-a40e-4d4b-a51f-1930b1a66cb6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:32:53 compute-0 nova_compute[187223]: 2025-11-28 17:32:53.079 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:32:53 compute-0 nova_compute[187223]: 2025-11-28 17:32:53.582 187227 DEBUG nova.virt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Emitting event <LifecycleEvent: 1764351173.582325, cad1bde5-a40e-4d4b-a51f-1930b1a66cb6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:32:53 compute-0 nova_compute[187223]: 2025-11-28 17:32:53.583 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: cad1bde5-a40e-4d4b-a51f-1930b1a66cb6] VM Resumed (Lifecycle Event)
Nov 28 17:32:53 compute-0 nova_compute[187223]: 2025-11-28 17:32:53.620 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: cad1bde5-a40e-4d4b-a51f-1930b1a66cb6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:32:53 compute-0 nova_compute[187223]: 2025-11-28 17:32:53.624 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: cad1bde5-a40e-4d4b-a51f-1930b1a66cb6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 28 17:32:53 compute-0 nova_compute[187223]: 2025-11-28 17:32:53.758 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: cad1bde5-a40e-4d4b-a51f-1930b1a66cb6] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-0.ctlplane.example.com
Nov 28 17:32:54 compute-0 ovn_controller[95574]: 2025-11-28T17:32:54Z|00043|binding|INFO|Claiming lport 129faec9-e6b1-488e-b6bb-5b2e828e626f for this chassis.
Nov 28 17:32:54 compute-0 ovn_controller[95574]: 2025-11-28T17:32:54Z|00044|binding|INFO|129faec9-e6b1-488e-b6bb-5b2e828e626f: Claiming fa:16:3e:77:63:1c 10.100.0.5
Nov 28 17:32:54 compute-0 ovn_controller[95574]: 2025-11-28T17:32:54Z|00045|binding|INFO|Setting lport 129faec9-e6b1-488e-b6bb-5b2e828e626f up in Southbound
Nov 28 17:32:54 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:32:54.697 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:77:63:1c 10.100.0.5'], port_security=['fa:16:3e:77:63:1c 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'cad1bde5-a40e-4d4b-a51f-1930b1a66cb6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-015f34bb-5da1-42eb-bab2-066f32a46dd5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ea47683b97094cc99b882a5a1b90949f', 'neutron:revision_number': '11', 'neutron:security_group_ids': '82cdfce6-8f2d-44f3-bd0a-80dabea6bfd5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f4d8a34e-2c92-41ae-a2d1-bdb3f1fafb55, chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], logical_port=129faec9-e6b1-488e-b6bb-5b2e828e626f) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 17:32:54 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:32:54.698 104433 INFO neutron.agent.ovn.metadata.agent [-] Port 129faec9-e6b1-488e-b6bb-5b2e828e626f in datapath 015f34bb-5da1-42eb-bab2-066f32a46dd5 bound to our chassis
Nov 28 17:32:54 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:32:54.700 104433 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 015f34bb-5da1-42eb-bab2-066f32a46dd5
Nov 28 17:32:54 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:32:54.727 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[836ecee3-5cab-4d7f-800b-5a0a9f5580ec]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:32:54 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:32:54.765 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[4ae6cdea-f836-4b0b-87f8-bee99bc06856]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:32:54 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:32:54.768 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[868ee567-4a3f-4a7d-a96e-29b65629db05]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:32:54 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:32:54.800 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[1df7b221-12ff-40f7-be2a-5e06d27b044d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:32:54 compute-0 nova_compute[187223]: 2025-11-28 17:32:54.817 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:32:54 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:32:54.820 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[dff7fa9d-d0f5-426c-a08f-efb541770428]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap015f34bb-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3e:3c:9a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 17, 'tx_packets': 9, 'rx_bytes': 994, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 17, 'tx_packets': 9, 'rx_bytes': 994, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 424119, 'reachable_time': 33234, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209666, 'error': None, 'target': 'ovnmeta-015f34bb-5da1-42eb-bab2-066f32a46dd5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:32:54 compute-0 nova_compute[187223]: 2025-11-28 17:32:54.827 187227 INFO nova.compute.manager [None req-61cf8680-8fcb-4fe7-8227-ef005cd4d8e4 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: cad1bde5-a40e-4d4b-a51f-1930b1a66cb6] Post operation of migration started
Nov 28 17:32:54 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:32:54.844 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[25fbbfd3-47f3-4a0a-9cfe-e3f3eae28419]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap015f34bb-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 424137, 'tstamp': 424137}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209667, 'error': None, 'target': 'ovnmeta-015f34bb-5da1-42eb-bab2-066f32a46dd5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap015f34bb-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 424140, 'tstamp': 424140}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209667, 'error': None, 'target': 'ovnmeta-015f34bb-5da1-42eb-bab2-066f32a46dd5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:32:54 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:32:54.847 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap015f34bb-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:32:54 compute-0 nova_compute[187223]: 2025-11-28 17:32:54.849 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:32:54 compute-0 nova_compute[187223]: 2025-11-28 17:32:54.850 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:32:54 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:32:54.853 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap015f34bb-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:32:54 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:32:54.854 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 17:32:54 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:32:54.854 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap015f34bb-50, col_values=(('external_ids', {'iface-id': '2de820a4-0104-4404-a104-bd64f5ebe5e5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:32:54 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:32:54.854 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 17:32:55 compute-0 podman[209668]: 2025-11-28 17:32:55.217894065 +0000 UTC m=+0.073045254 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 28 17:32:55 compute-0 podman[209669]: 2025-11-28 17:32:55.276041047 +0000 UTC m=+0.131216127 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Nov 28 17:32:55 compute-0 nova_compute[187223]: 2025-11-28 17:32:55.341 187227 DEBUG oslo_concurrency.lockutils [None req-61cf8680-8fcb-4fe7-8227-ef005cd4d8e4 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "refresh_cache-cad1bde5-a40e-4d4b-a51f-1930b1a66cb6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 17:32:55 compute-0 nova_compute[187223]: 2025-11-28 17:32:55.342 187227 DEBUG oslo_concurrency.lockutils [None req-61cf8680-8fcb-4fe7-8227-ef005cd4d8e4 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquired lock "refresh_cache-cad1bde5-a40e-4d4b-a51f-1930b1a66cb6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 17:32:55 compute-0 nova_compute[187223]: 2025-11-28 17:32:55.342 187227 DEBUG nova.network.neutron [None req-61cf8680-8fcb-4fe7-8227-ef005cd4d8e4 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: cad1bde5-a40e-4d4b-a51f-1930b1a66cb6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 28 17:32:57 compute-0 podman[209714]: 2025-11-28 17:32:57.220244069 +0000 UTC m=+0.065241719 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=ubi9-minimal, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, architecture=x86_64, distribution-scope=public)
Nov 28 17:32:58 compute-0 nova_compute[187223]: 2025-11-28 17:32:58.082 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:32:59 compute-0 nova_compute[187223]: 2025-11-28 17:32:59.217 187227 DEBUG nova.network.neutron [None req-61cf8680-8fcb-4fe7-8227-ef005cd4d8e4 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: cad1bde5-a40e-4d4b-a51f-1930b1a66cb6] Updating instance_info_cache with network_info: [{"id": "129faec9-e6b1-488e-b6bb-5b2e828e626f", "address": "fa:16:3e:77:63:1c", "network": {"id": "015f34bb-5da1-42eb-bab2-066f32a46dd5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-213410289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ea47683b97094cc99b882a5a1b90949f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap129faec9-e6", "ovs_interfaceid": "129faec9-e6b1-488e-b6bb-5b2e828e626f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:32:59 compute-0 nova_compute[187223]: 2025-11-28 17:32:59.253 187227 DEBUG oslo_concurrency.lockutils [None req-61cf8680-8fcb-4fe7-8227-ef005cd4d8e4 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Releasing lock "refresh_cache-cad1bde5-a40e-4d4b-a51f-1930b1a66cb6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 17:32:59 compute-0 nova_compute[187223]: 2025-11-28 17:32:59.274 187227 DEBUG oslo_concurrency.lockutils [None req-61cf8680-8fcb-4fe7-8227-ef005cd4d8e4 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:32:59 compute-0 nova_compute[187223]: 2025-11-28 17:32:59.275 187227 DEBUG oslo_concurrency.lockutils [None req-61cf8680-8fcb-4fe7-8227-ef005cd4d8e4 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:32:59 compute-0 nova_compute[187223]: 2025-11-28 17:32:59.275 187227 DEBUG oslo_concurrency.lockutils [None req-61cf8680-8fcb-4fe7-8227-ef005cd4d8e4 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:32:59 compute-0 nova_compute[187223]: 2025-11-28 17:32:59.284 187227 INFO nova.virt.libvirt.driver [None req-61cf8680-8fcb-4fe7-8227-ef005cd4d8e4 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: cad1bde5-a40e-4d4b-a51f-1930b1a66cb6] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Nov 28 17:32:59 compute-0 virtqemud[186845]: Domain id=4 name='instance-00000003' uuid=cad1bde5-a40e-4d4b-a51f-1930b1a66cb6 is tainted: custom-monitor
Nov 28 17:32:59 compute-0 podman[197556]: time="2025-11-28T17:32:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:32:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:32:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18323 "" "Go-http-client/1.1"
Nov 28 17:32:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:32:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3049 "" "Go-http-client/1.1"
Nov 28 17:32:59 compute-0 nova_compute[187223]: 2025-11-28 17:32:59.861 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:33:00 compute-0 nova_compute[187223]: 2025-11-28 17:33:00.296 187227 INFO nova.virt.libvirt.driver [None req-61cf8680-8fcb-4fe7-8227-ef005cd4d8e4 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: cad1bde5-a40e-4d4b-a51f-1930b1a66cb6] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Nov 28 17:33:01 compute-0 nova_compute[187223]: 2025-11-28 17:33:01.304 187227 INFO nova.virt.libvirt.driver [None req-61cf8680-8fcb-4fe7-8227-ef005cd4d8e4 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: cad1bde5-a40e-4d4b-a51f-1930b1a66cb6] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Nov 28 17:33:01 compute-0 nova_compute[187223]: 2025-11-28 17:33:01.310 187227 DEBUG nova.compute.manager [None req-61cf8680-8fcb-4fe7-8227-ef005cd4d8e4 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: cad1bde5-a40e-4d4b-a51f-1930b1a66cb6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:33:01 compute-0 nova_compute[187223]: 2025-11-28 17:33:01.335 187227 DEBUG nova.objects.instance [None req-61cf8680-8fcb-4fe7-8227-ef005cd4d8e4 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: cad1bde5-a40e-4d4b-a51f-1930b1a66cb6] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Nov 28 17:33:01 compute-0 openstack_network_exporter[199717]: ERROR   17:33:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:33:01 compute-0 openstack_network_exporter[199717]: ERROR   17:33:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:33:01 compute-0 openstack_network_exporter[199717]: ERROR   17:33:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:33:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:33:01 compute-0 openstack_network_exporter[199717]: ERROR   17:33:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:33:01 compute-0 openstack_network_exporter[199717]: ERROR   17:33:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:33:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:33:03 compute-0 nova_compute[187223]: 2025-11-28 17:33:03.086 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:33:04 compute-0 nova_compute[187223]: 2025-11-28 17:33:04.863 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:33:08 compute-0 nova_compute[187223]: 2025-11-28 17:33:08.088 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:33:09 compute-0 nova_compute[187223]: 2025-11-28 17:33:09.865 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:33:13 compute-0 nova_compute[187223]: 2025-11-28 17:33:13.092 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:33:13 compute-0 podman[209737]: 2025-11-28 17:33:13.188109868 +0000 UTC m=+0.050010478 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 17:33:13 compute-0 nova_compute[187223]: 2025-11-28 17:33:13.736 187227 DEBUG oslo_concurrency.lockutils [None req-304b69cb-25e7-47c4-828d-a0fd82a61884 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Acquiring lock "11258d07-82c4-4bf7-9965-9bd6fa9f6a10" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:33:13 compute-0 nova_compute[187223]: 2025-11-28 17:33:13.737 187227 DEBUG oslo_concurrency.lockutils [None req-304b69cb-25e7-47c4-828d-a0fd82a61884 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Lock "11258d07-82c4-4bf7-9965-9bd6fa9f6a10" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:33:13 compute-0 nova_compute[187223]: 2025-11-28 17:33:13.737 187227 DEBUG oslo_concurrency.lockutils [None req-304b69cb-25e7-47c4-828d-a0fd82a61884 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Acquiring lock "11258d07-82c4-4bf7-9965-9bd6fa9f6a10-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:33:13 compute-0 nova_compute[187223]: 2025-11-28 17:33:13.737 187227 DEBUG oslo_concurrency.lockutils [None req-304b69cb-25e7-47c4-828d-a0fd82a61884 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Lock "11258d07-82c4-4bf7-9965-9bd6fa9f6a10-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:33:13 compute-0 nova_compute[187223]: 2025-11-28 17:33:13.737 187227 DEBUG oslo_concurrency.lockutils [None req-304b69cb-25e7-47c4-828d-a0fd82a61884 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Lock "11258d07-82c4-4bf7-9965-9bd6fa9f6a10-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:33:13 compute-0 nova_compute[187223]: 2025-11-28 17:33:13.739 187227 INFO nova.compute.manager [None req-304b69cb-25e7-47c4-828d-a0fd82a61884 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] Terminating instance
Nov 28 17:33:13 compute-0 nova_compute[187223]: 2025-11-28 17:33:13.740 187227 DEBUG nova.compute.manager [None req-304b69cb-25e7-47c4-828d-a0fd82a61884 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 28 17:33:13 compute-0 kernel: tapfd926b88-53 (unregistering): left promiscuous mode
Nov 28 17:33:13 compute-0 NetworkManager[55763]: <info>  [1764351193.7695] device (tapfd926b88-53): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 28 17:33:13 compute-0 ovn_controller[95574]: 2025-11-28T17:33:13Z|00046|binding|INFO|Releasing lport fd926b88-53a3-4c34-aaa1-2957370ba65a from this chassis (sb_readonly=0)
Nov 28 17:33:13 compute-0 ovn_controller[95574]: 2025-11-28T17:33:13Z|00047|binding|INFO|Setting lport fd926b88-53a3-4c34-aaa1-2957370ba65a down in Southbound
Nov 28 17:33:13 compute-0 ovn_controller[95574]: 2025-11-28T17:33:13Z|00048|binding|INFO|Removing iface tapfd926b88-53 ovn-installed in OVS
Nov 28 17:33:13 compute-0 nova_compute[187223]: 2025-11-28 17:33:13.787 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:33:13 compute-0 nova_compute[187223]: 2025-11-28 17:33:13.802 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:33:13 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:33:13.804 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4a:40:ab 10.100.0.14'], port_security=['fa:16:3e:4a:40:ab 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '11258d07-82c4-4bf7-9965-9bd6fa9f6a10', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-015f34bb-5da1-42eb-bab2-066f32a46dd5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ea47683b97094cc99b882a5a1b90949f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '82cdfce6-8f2d-44f3-bd0a-80dabea6bfd5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f4d8a34e-2c92-41ae-a2d1-bdb3f1fafb55, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], logical_port=fd926b88-53a3-4c34-aaa1-2957370ba65a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 17:33:13 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:33:13.808 104433 INFO neutron.agent.ovn.metadata.agent [-] Port fd926b88-53a3-4c34-aaa1-2957370ba65a in datapath 015f34bb-5da1-42eb-bab2-066f32a46dd5 unbound from our chassis
Nov 28 17:33:13 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:33:13.810 104433 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 015f34bb-5da1-42eb-bab2-066f32a46dd5
Nov 28 17:33:13 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:33:13.827 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[4e66a6d1-3390-4a68-a8f5-b46f731bd84e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:33:13 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000004.scope: Deactivated successfully.
Nov 28 17:33:13 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000004.scope: Consumed 16.341s CPU time.
Nov 28 17:33:13 compute-0 systemd-machined[153517]: Machine qemu-3-instance-00000004 terminated.
Nov 28 17:33:13 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:33:13.868 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[195820f2-a4bf-4917-9b9d-e8742ac5b1d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:33:13 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:33:13.872 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[79e2d593-bfc9-4e7a-91fe-138d41868dbd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:33:13 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:33:13.907 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[53790f78-ce31-4164-9a15-a3c4236a00c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:33:13 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:33:13.925 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[d4594c23-b011-41c6-9f7b-413f3d7332cb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap015f34bb-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3e:3c:9a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 32, 'tx_packets': 11, 'rx_bytes': 1624, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 32, 'tx_packets': 11, 'rx_bytes': 1624, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 424119, 'reachable_time': 33234, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209772, 'error': None, 'target': 'ovnmeta-015f34bb-5da1-42eb-bab2-066f32a46dd5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:33:13 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:33:13.943 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[d5932cd7-a8eb-4af3-88a1-4acdecbd0372]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap015f34bb-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 424137, 'tstamp': 424137}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209773, 'error': None, 'target': 'ovnmeta-015f34bb-5da1-42eb-bab2-066f32a46dd5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap015f34bb-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 424140, 'tstamp': 424140}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209773, 'error': None, 'target': 'ovnmeta-015f34bb-5da1-42eb-bab2-066f32a46dd5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:33:13 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:33:13.944 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap015f34bb-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:33:13 compute-0 nova_compute[187223]: 2025-11-28 17:33:13.945 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:33:13 compute-0 nova_compute[187223]: 2025-11-28 17:33:13.951 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:33:13 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:33:13.952 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap015f34bb-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:33:13 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:33:13.952 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 17:33:13 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:33:13.953 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap015f34bb-50, col_values=(('external_ids', {'iface-id': '2de820a4-0104-4404-a104-bd64f5ebe5e5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:33:13 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:33:13.953 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 17:33:14 compute-0 nova_compute[187223]: 2025-11-28 17:33:14.017 187227 INFO nova.virt.libvirt.driver [-] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] Instance destroyed successfully.
Nov 28 17:33:14 compute-0 nova_compute[187223]: 2025-11-28 17:33:14.018 187227 DEBUG nova.objects.instance [None req-304b69cb-25e7-47c4-828d-a0fd82a61884 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Lazy-loading 'resources' on Instance uuid 11258d07-82c4-4bf7-9965-9bd6fa9f6a10 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:33:14 compute-0 nova_compute[187223]: 2025-11-28 17:33:14.045 187227 DEBUG nova.virt.libvirt.vif [None req-304b69cb-25e7-47c4-828d-a0fd82a61884 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T17:31:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-413330110',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-413330110',id=4,image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-28T17:31:55Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ea47683b97094cc99b882a5a1b90949f',ramdisk_id='',reservation_id='r-6watjtec',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owne
r_project_name='tempest-TestExecuteActionsViaActuator-544878882',owner_user_name='tempest-TestExecuteActionsViaActuator-544878882-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-28T17:31:55Z,user_data=None,user_id='f7ca965410e74fcabced6e50aab5d096',uuid=11258d07-82c4-4bf7-9965-9bd6fa9f6a10,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fd926b88-53a3-4c34-aaa1-2957370ba65a", "address": "fa:16:3e:4a:40:ab", "network": {"id": "015f34bb-5da1-42eb-bab2-066f32a46dd5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-213410289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ea47683b97094cc99b882a5a1b90949f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd926b88-53", "ovs_interfaceid": "fd926b88-53a3-4c34-aaa1-2957370ba65a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 28 17:33:14 compute-0 nova_compute[187223]: 2025-11-28 17:33:14.046 187227 DEBUG nova.network.os_vif_util [None req-304b69cb-25e7-47c4-828d-a0fd82a61884 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Converting VIF {"id": "fd926b88-53a3-4c34-aaa1-2957370ba65a", "address": "fa:16:3e:4a:40:ab", "network": {"id": "015f34bb-5da1-42eb-bab2-066f32a46dd5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-213410289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ea47683b97094cc99b882a5a1b90949f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd926b88-53", "ovs_interfaceid": "fd926b88-53a3-4c34-aaa1-2957370ba65a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 17:33:14 compute-0 nova_compute[187223]: 2025-11-28 17:33:14.047 187227 DEBUG nova.network.os_vif_util [None req-304b69cb-25e7-47c4-828d-a0fd82a61884 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4a:40:ab,bridge_name='br-int',has_traffic_filtering=True,id=fd926b88-53a3-4c34-aaa1-2957370ba65a,network=Network(015f34bb-5da1-42eb-bab2-066f32a46dd5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfd926b88-53') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 17:33:14 compute-0 nova_compute[187223]: 2025-11-28 17:33:14.047 187227 DEBUG os_vif [None req-304b69cb-25e7-47c4-828d-a0fd82a61884 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4a:40:ab,bridge_name='br-int',has_traffic_filtering=True,id=fd926b88-53a3-4c34-aaa1-2957370ba65a,network=Network(015f34bb-5da1-42eb-bab2-066f32a46dd5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfd926b88-53') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 28 17:33:14 compute-0 nova_compute[187223]: 2025-11-28 17:33:14.051 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:33:14 compute-0 nova_compute[187223]: 2025-11-28 17:33:14.052 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfd926b88-53, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:33:14 compute-0 nova_compute[187223]: 2025-11-28 17:33:14.054 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:33:14 compute-0 nova_compute[187223]: 2025-11-28 17:33:14.057 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 17:33:14 compute-0 nova_compute[187223]: 2025-11-28 17:33:14.060 187227 INFO os_vif [None req-304b69cb-25e7-47c4-828d-a0fd82a61884 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4a:40:ab,bridge_name='br-int',has_traffic_filtering=True,id=fd926b88-53a3-4c34-aaa1-2957370ba65a,network=Network(015f34bb-5da1-42eb-bab2-066f32a46dd5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfd926b88-53')
Nov 28 17:33:14 compute-0 nova_compute[187223]: 2025-11-28 17:33:14.060 187227 INFO nova.virt.libvirt.driver [None req-304b69cb-25e7-47c4-828d-a0fd82a61884 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] Deleting instance files /var/lib/nova/instances/11258d07-82c4-4bf7-9965-9bd6fa9f6a10_del
Nov 28 17:33:14 compute-0 nova_compute[187223]: 2025-11-28 17:33:14.061 187227 INFO nova.virt.libvirt.driver [None req-304b69cb-25e7-47c4-828d-a0fd82a61884 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] Deletion of /var/lib/nova/instances/11258d07-82c4-4bf7-9965-9bd6fa9f6a10_del complete
Nov 28 17:33:14 compute-0 nova_compute[187223]: 2025-11-28 17:33:14.139 187227 DEBUG nova.virt.libvirt.host [None req-304b69cb-25e7-47c4-828d-a0fd82a61884 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754
Nov 28 17:33:14 compute-0 nova_compute[187223]: 2025-11-28 17:33:14.139 187227 INFO nova.virt.libvirt.host [None req-304b69cb-25e7-47c4-828d-a0fd82a61884 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] UEFI support detected
Nov 28 17:33:14 compute-0 nova_compute[187223]: 2025-11-28 17:33:14.141 187227 INFO nova.compute.manager [None req-304b69cb-25e7-47c4-828d-a0fd82a61884 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] Took 0.40 seconds to destroy the instance on the hypervisor.
Nov 28 17:33:14 compute-0 nova_compute[187223]: 2025-11-28 17:33:14.142 187227 DEBUG oslo.service.loopingcall [None req-304b69cb-25e7-47c4-828d-a0fd82a61884 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 28 17:33:14 compute-0 nova_compute[187223]: 2025-11-28 17:33:14.142 187227 DEBUG nova.compute.manager [-] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 28 17:33:14 compute-0 nova_compute[187223]: 2025-11-28 17:33:14.142 187227 DEBUG nova.network.neutron [-] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 28 17:33:14 compute-0 nova_compute[187223]: 2025-11-28 17:33:14.480 187227 DEBUG nova.compute.manager [req-76e75608-aa32-47ab-b614-b1d944a2fc15 req-5857de88-38ac-4340-b57f-0cb1d66c80f2 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] Received event network-vif-unplugged-fd926b88-53a3-4c34-aaa1-2957370ba65a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:33:14 compute-0 nova_compute[187223]: 2025-11-28 17:33:14.481 187227 DEBUG oslo_concurrency.lockutils [req-76e75608-aa32-47ab-b614-b1d944a2fc15 req-5857de88-38ac-4340-b57f-0cb1d66c80f2 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "11258d07-82c4-4bf7-9965-9bd6fa9f6a10-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:33:14 compute-0 nova_compute[187223]: 2025-11-28 17:33:14.481 187227 DEBUG oslo_concurrency.lockutils [req-76e75608-aa32-47ab-b614-b1d944a2fc15 req-5857de88-38ac-4340-b57f-0cb1d66c80f2 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "11258d07-82c4-4bf7-9965-9bd6fa9f6a10-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:33:14 compute-0 nova_compute[187223]: 2025-11-28 17:33:14.482 187227 DEBUG oslo_concurrency.lockutils [req-76e75608-aa32-47ab-b614-b1d944a2fc15 req-5857de88-38ac-4340-b57f-0cb1d66c80f2 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "11258d07-82c4-4bf7-9965-9bd6fa9f6a10-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:33:14 compute-0 nova_compute[187223]: 2025-11-28 17:33:14.482 187227 DEBUG nova.compute.manager [req-76e75608-aa32-47ab-b614-b1d944a2fc15 req-5857de88-38ac-4340-b57f-0cb1d66c80f2 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] No waiting events found dispatching network-vif-unplugged-fd926b88-53a3-4c34-aaa1-2957370ba65a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:33:14 compute-0 nova_compute[187223]: 2025-11-28 17:33:14.482 187227 DEBUG nova.compute.manager [req-76e75608-aa32-47ab-b614-b1d944a2fc15 req-5857de88-38ac-4340-b57f-0cb1d66c80f2 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] Received event network-vif-unplugged-fd926b88-53a3-4c34-aaa1-2957370ba65a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 28 17:33:15 compute-0 nova_compute[187223]: 2025-11-28 17:33:15.302 187227 DEBUG nova.network.neutron [-] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:33:15 compute-0 nova_compute[187223]: 2025-11-28 17:33:15.325 187227 INFO nova.compute.manager [-] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] Took 1.18 seconds to deallocate network for instance.
Nov 28 17:33:15 compute-0 nova_compute[187223]: 2025-11-28 17:33:15.386 187227 DEBUG oslo_concurrency.lockutils [None req-304b69cb-25e7-47c4-828d-a0fd82a61884 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:33:15 compute-0 nova_compute[187223]: 2025-11-28 17:33:15.386 187227 DEBUG oslo_concurrency.lockutils [None req-304b69cb-25e7-47c4-828d-a0fd82a61884 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:33:15 compute-0 nova_compute[187223]: 2025-11-28 17:33:15.389 187227 DEBUG nova.compute.manager [req-269c45af-d0ee-4ded-a54e-81809ee0319d req-e224645d-9ba1-4e41-a2b3-5367781158a8 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] Received event network-vif-deleted-fd926b88-53a3-4c34-aaa1-2957370ba65a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:33:15 compute-0 nova_compute[187223]: 2025-11-28 17:33:15.507 187227 DEBUG nova.compute.provider_tree [None req-304b69cb-25e7-47c4-828d-a0fd82a61884 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 17:33:15 compute-0 nova_compute[187223]: 2025-11-28 17:33:15.551 187227 DEBUG nova.scheduler.client.report [None req-304b69cb-25e7-47c4-828d-a0fd82a61884 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 17:33:15 compute-0 nova_compute[187223]: 2025-11-28 17:33:15.585 187227 DEBUG oslo_concurrency.lockutils [None req-304b69cb-25e7-47c4-828d-a0fd82a61884 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.199s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:33:15 compute-0 nova_compute[187223]: 2025-11-28 17:33:15.632 187227 INFO nova.scheduler.client.report [None req-304b69cb-25e7-47c4-828d-a0fd82a61884 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Deleted allocations for instance 11258d07-82c4-4bf7-9965-9bd6fa9f6a10
Nov 28 17:33:15 compute-0 nova_compute[187223]: 2025-11-28 17:33:15.754 187227 DEBUG oslo_concurrency.lockutils [None req-304b69cb-25e7-47c4-828d-a0fd82a61884 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Lock "11258d07-82c4-4bf7-9965-9bd6fa9f6a10" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.017s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:33:16 compute-0 nova_compute[187223]: 2025-11-28 17:33:16.048 187227 DEBUG oslo_concurrency.lockutils [None req-02d3e982-70f9-4bcd-a428-8a8b24c68329 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Acquiring lock "cad1bde5-a40e-4d4b-a51f-1930b1a66cb6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:33:16 compute-0 nova_compute[187223]: 2025-11-28 17:33:16.049 187227 DEBUG oslo_concurrency.lockutils [None req-02d3e982-70f9-4bcd-a428-8a8b24c68329 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Lock "cad1bde5-a40e-4d4b-a51f-1930b1a66cb6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:33:16 compute-0 nova_compute[187223]: 2025-11-28 17:33:16.049 187227 DEBUG oslo_concurrency.lockutils [None req-02d3e982-70f9-4bcd-a428-8a8b24c68329 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Acquiring lock "cad1bde5-a40e-4d4b-a51f-1930b1a66cb6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:33:16 compute-0 nova_compute[187223]: 2025-11-28 17:33:16.050 187227 DEBUG oslo_concurrency.lockutils [None req-02d3e982-70f9-4bcd-a428-8a8b24c68329 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Lock "cad1bde5-a40e-4d4b-a51f-1930b1a66cb6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:33:16 compute-0 nova_compute[187223]: 2025-11-28 17:33:16.050 187227 DEBUG oslo_concurrency.lockutils [None req-02d3e982-70f9-4bcd-a428-8a8b24c68329 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Lock "cad1bde5-a40e-4d4b-a51f-1930b1a66cb6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:33:16 compute-0 nova_compute[187223]: 2025-11-28 17:33:16.051 187227 INFO nova.compute.manager [None req-02d3e982-70f9-4bcd-a428-8a8b24c68329 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: cad1bde5-a40e-4d4b-a51f-1930b1a66cb6] Terminating instance
Nov 28 17:33:16 compute-0 nova_compute[187223]: 2025-11-28 17:33:16.052 187227 DEBUG nova.compute.manager [None req-02d3e982-70f9-4bcd-a428-8a8b24c68329 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: cad1bde5-a40e-4d4b-a51f-1930b1a66cb6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 28 17:33:16 compute-0 kernel: tap129faec9-e6 (unregistering): left promiscuous mode
Nov 28 17:33:16 compute-0 NetworkManager[55763]: <info>  [1764351196.0789] device (tap129faec9-e6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 28 17:33:16 compute-0 nova_compute[187223]: 2025-11-28 17:33:16.082 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:33:16 compute-0 ovn_controller[95574]: 2025-11-28T17:33:16Z|00049|binding|INFO|Releasing lport 129faec9-e6b1-488e-b6bb-5b2e828e626f from this chassis (sb_readonly=0)
Nov 28 17:33:16 compute-0 ovn_controller[95574]: 2025-11-28T17:33:16Z|00050|binding|INFO|Setting lport 129faec9-e6b1-488e-b6bb-5b2e828e626f down in Southbound
Nov 28 17:33:16 compute-0 ovn_controller[95574]: 2025-11-28T17:33:16Z|00051|binding|INFO|Removing iface tap129faec9-e6 ovn-installed in OVS
Nov 28 17:33:16 compute-0 nova_compute[187223]: 2025-11-28 17:33:16.087 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:33:16 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:33:16.094 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:77:63:1c 10.100.0.5'], port_security=['fa:16:3e:77:63:1c 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'cad1bde5-a40e-4d4b-a51f-1930b1a66cb6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-015f34bb-5da1-42eb-bab2-066f32a46dd5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ea47683b97094cc99b882a5a1b90949f', 'neutron:revision_number': '13', 'neutron:security_group_ids': '82cdfce6-8f2d-44f3-bd0a-80dabea6bfd5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f4d8a34e-2c92-41ae-a2d1-bdb3f1fafb55, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], logical_port=129faec9-e6b1-488e-b6bb-5b2e828e626f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 17:33:16 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:33:16.095 104433 INFO neutron.agent.ovn.metadata.agent [-] Port 129faec9-e6b1-488e-b6bb-5b2e828e626f in datapath 015f34bb-5da1-42eb-bab2-066f32a46dd5 unbound from our chassis
Nov 28 17:33:16 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:33:16.097 104433 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 015f34bb-5da1-42eb-bab2-066f32a46dd5
Nov 28 17:33:16 compute-0 nova_compute[187223]: 2025-11-28 17:33:16.103 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:33:16 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:33:16.113 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[02e1502e-eea0-445b-92ee-1a26e6e16029]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:33:16 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000003.scope: Deactivated successfully.
Nov 28 17:33:16 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000003.scope: Consumed 2.304s CPU time.
Nov 28 17:33:16 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:33:16.143 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[9966b8bd-0b09-4f15-857f-28496872e855]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:33:16 compute-0 systemd-machined[153517]: Machine qemu-4-instance-00000003 terminated.
Nov 28 17:33:16 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:33:16.146 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[b21f4e59-f233-4396-b3b6-edc713adc41a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:33:16 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:33:16.172 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[56edc830-8c7b-4eef-bade-4a2a250deb46]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:33:16 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:33:16.189 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[0dbfedd0-814a-4ff7-b111-531ce23b55dc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap015f34bb-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3e:3c:9a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 32, 'tx_packets': 13, 'rx_bytes': 1624, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 32, 'tx_packets': 13, 'rx_bytes': 1624, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 424119, 'reachable_time': 33234, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209802, 'error': None, 'target': 'ovnmeta-015f34bb-5da1-42eb-bab2-066f32a46dd5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:33:16 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:33:16.204 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[4966374c-eca5-4c6b-b94d-885240d7d6bc]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap015f34bb-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 424137, 'tstamp': 424137}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209803, 'error': None, 'target': 'ovnmeta-015f34bb-5da1-42eb-bab2-066f32a46dd5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap015f34bb-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 424140, 'tstamp': 424140}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209803, 'error': None, 'target': 'ovnmeta-015f34bb-5da1-42eb-bab2-066f32a46dd5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:33:16 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:33:16.206 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap015f34bb-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:33:16 compute-0 nova_compute[187223]: 2025-11-28 17:33:16.256 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:33:16 compute-0 nova_compute[187223]: 2025-11-28 17:33:16.262 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:33:16 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:33:16.262 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap015f34bb-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:33:16 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:33:16.263 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 17:33:16 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:33:16.263 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap015f34bb-50, col_values=(('external_ids', {'iface-id': '2de820a4-0104-4404-a104-bd64f5ebe5e5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:33:16 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:33:16.264 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 17:33:16 compute-0 nova_compute[187223]: 2025-11-28 17:33:16.305 187227 INFO nova.virt.libvirt.driver [-] [instance: cad1bde5-a40e-4d4b-a51f-1930b1a66cb6] Instance destroyed successfully.
Nov 28 17:33:16 compute-0 nova_compute[187223]: 2025-11-28 17:33:16.305 187227 DEBUG nova.objects.instance [None req-02d3e982-70f9-4bcd-a428-8a8b24c68329 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Lazy-loading 'resources' on Instance uuid cad1bde5-a40e-4d4b-a51f-1930b1a66cb6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:33:16 compute-0 nova_compute[187223]: 2025-11-28 17:33:16.329 187227 DEBUG nova.virt.libvirt.vif [None req-02d3e982-70f9-4bcd-a428-8a8b24c68329 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-28T17:31:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-375892467',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-375892467',id=3,image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-28T17:31:42Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ea47683b97094cc99b882a5a1b90949f',ramdisk_id='',reservation_id='r-jduii4z4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-544878882',owner_user_name='tempest-TestExecuteActionsViaActuator-544878882-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-28T17:33:01Z,user_data=None,user_id='f7ca965410e74fcabced6e50aab5d096',uuid=cad1bde5-a40e-4d4b-a51f-1930b1a66cb6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "129faec9-e6b1-488e-b6bb-5b2e828e626f", "address": "fa:16:3e:77:63:1c", "network": {"id": "015f34bb-5da1-42eb-bab2-066f32a46dd5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-213410289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ea47683b97094cc99b882a5a1b90949f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap129faec9-e6", "ovs_interfaceid": "129faec9-e6b1-488e-b6bb-5b2e828e626f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 28 17:33:16 compute-0 nova_compute[187223]: 2025-11-28 17:33:16.330 187227 DEBUG nova.network.os_vif_util [None req-02d3e982-70f9-4bcd-a428-8a8b24c68329 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Converting VIF {"id": "129faec9-e6b1-488e-b6bb-5b2e828e626f", "address": "fa:16:3e:77:63:1c", "network": {"id": "015f34bb-5da1-42eb-bab2-066f32a46dd5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-213410289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ea47683b97094cc99b882a5a1b90949f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap129faec9-e6", "ovs_interfaceid": "129faec9-e6b1-488e-b6bb-5b2e828e626f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 17:33:16 compute-0 nova_compute[187223]: 2025-11-28 17:33:16.330 187227 DEBUG nova.network.os_vif_util [None req-02d3e982-70f9-4bcd-a428-8a8b24c68329 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:77:63:1c,bridge_name='br-int',has_traffic_filtering=True,id=129faec9-e6b1-488e-b6bb-5b2e828e626f,network=Network(015f34bb-5da1-42eb-bab2-066f32a46dd5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap129faec9-e6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 17:33:16 compute-0 nova_compute[187223]: 2025-11-28 17:33:16.330 187227 DEBUG os_vif [None req-02d3e982-70f9-4bcd-a428-8a8b24c68329 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:77:63:1c,bridge_name='br-int',has_traffic_filtering=True,id=129faec9-e6b1-488e-b6bb-5b2e828e626f,network=Network(015f34bb-5da1-42eb-bab2-066f32a46dd5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap129faec9-e6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 28 17:33:16 compute-0 nova_compute[187223]: 2025-11-28 17:33:16.332 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:33:16 compute-0 nova_compute[187223]: 2025-11-28 17:33:16.332 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap129faec9-e6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:33:16 compute-0 nova_compute[187223]: 2025-11-28 17:33:16.334 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:33:16 compute-0 nova_compute[187223]: 2025-11-28 17:33:16.337 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 17:33:16 compute-0 nova_compute[187223]: 2025-11-28 17:33:16.339 187227 INFO os_vif [None req-02d3e982-70f9-4bcd-a428-8a8b24c68329 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:77:63:1c,bridge_name='br-int',has_traffic_filtering=True,id=129faec9-e6b1-488e-b6bb-5b2e828e626f,network=Network(015f34bb-5da1-42eb-bab2-066f32a46dd5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap129faec9-e6')
Nov 28 17:33:16 compute-0 nova_compute[187223]: 2025-11-28 17:33:16.340 187227 INFO nova.virt.libvirt.driver [None req-02d3e982-70f9-4bcd-a428-8a8b24c68329 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: cad1bde5-a40e-4d4b-a51f-1930b1a66cb6] Deleting instance files /var/lib/nova/instances/cad1bde5-a40e-4d4b-a51f-1930b1a66cb6_del
Nov 28 17:33:16 compute-0 nova_compute[187223]: 2025-11-28 17:33:16.341 187227 INFO nova.virt.libvirt.driver [None req-02d3e982-70f9-4bcd-a428-8a8b24c68329 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: cad1bde5-a40e-4d4b-a51f-1930b1a66cb6] Deletion of /var/lib/nova/instances/cad1bde5-a40e-4d4b-a51f-1930b1a66cb6_del complete
Nov 28 17:33:16 compute-0 nova_compute[187223]: 2025-11-28 17:33:16.398 187227 INFO nova.compute.manager [None req-02d3e982-70f9-4bcd-a428-8a8b24c68329 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: cad1bde5-a40e-4d4b-a51f-1930b1a66cb6] Took 0.35 seconds to destroy the instance on the hypervisor.
Nov 28 17:33:16 compute-0 nova_compute[187223]: 2025-11-28 17:33:16.399 187227 DEBUG oslo.service.loopingcall [None req-02d3e982-70f9-4bcd-a428-8a8b24c68329 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 28 17:33:16 compute-0 nova_compute[187223]: 2025-11-28 17:33:16.399 187227 DEBUG nova.compute.manager [-] [instance: cad1bde5-a40e-4d4b-a51f-1930b1a66cb6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 28 17:33:16 compute-0 nova_compute[187223]: 2025-11-28 17:33:16.399 187227 DEBUG nova.network.neutron [-] [instance: cad1bde5-a40e-4d4b-a51f-1930b1a66cb6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 28 17:33:17 compute-0 nova_compute[187223]: 2025-11-28 17:33:17.050 187227 DEBUG nova.compute.manager [req-e8fb4599-1023-4829-bb66-b6672c4c5198 req-615a6c37-6e16-4d3e-9eb9-a5306fdbd430 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] Received event network-vif-plugged-fd926b88-53a3-4c34-aaa1-2957370ba65a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:33:17 compute-0 nova_compute[187223]: 2025-11-28 17:33:17.050 187227 DEBUG oslo_concurrency.lockutils [req-e8fb4599-1023-4829-bb66-b6672c4c5198 req-615a6c37-6e16-4d3e-9eb9-a5306fdbd430 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "11258d07-82c4-4bf7-9965-9bd6fa9f6a10-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:33:17 compute-0 nova_compute[187223]: 2025-11-28 17:33:17.050 187227 DEBUG oslo_concurrency.lockutils [req-e8fb4599-1023-4829-bb66-b6672c4c5198 req-615a6c37-6e16-4d3e-9eb9-a5306fdbd430 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "11258d07-82c4-4bf7-9965-9bd6fa9f6a10-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:33:17 compute-0 nova_compute[187223]: 2025-11-28 17:33:17.051 187227 DEBUG oslo_concurrency.lockutils [req-e8fb4599-1023-4829-bb66-b6672c4c5198 req-615a6c37-6e16-4d3e-9eb9-a5306fdbd430 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "11258d07-82c4-4bf7-9965-9bd6fa9f6a10-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:33:17 compute-0 nova_compute[187223]: 2025-11-28 17:33:17.051 187227 DEBUG nova.compute.manager [req-e8fb4599-1023-4829-bb66-b6672c4c5198 req-615a6c37-6e16-4d3e-9eb9-a5306fdbd430 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] No waiting events found dispatching network-vif-plugged-fd926b88-53a3-4c34-aaa1-2957370ba65a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:33:17 compute-0 nova_compute[187223]: 2025-11-28 17:33:17.051 187227 WARNING nova.compute.manager [req-e8fb4599-1023-4829-bb66-b6672c4c5198 req-615a6c37-6e16-4d3e-9eb9-a5306fdbd430 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] Received unexpected event network-vif-plugged-fd926b88-53a3-4c34-aaa1-2957370ba65a for instance with vm_state deleted and task_state None.
Nov 28 17:33:17 compute-0 nova_compute[187223]: 2025-11-28 17:33:17.051 187227 DEBUG nova.compute.manager [req-e8fb4599-1023-4829-bb66-b6672c4c5198 req-615a6c37-6e16-4d3e-9eb9-a5306fdbd430 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: cad1bde5-a40e-4d4b-a51f-1930b1a66cb6] Received event network-vif-unplugged-129faec9-e6b1-488e-b6bb-5b2e828e626f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:33:17 compute-0 nova_compute[187223]: 2025-11-28 17:33:17.052 187227 DEBUG oslo_concurrency.lockutils [req-e8fb4599-1023-4829-bb66-b6672c4c5198 req-615a6c37-6e16-4d3e-9eb9-a5306fdbd430 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "cad1bde5-a40e-4d4b-a51f-1930b1a66cb6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:33:17 compute-0 nova_compute[187223]: 2025-11-28 17:33:17.052 187227 DEBUG oslo_concurrency.lockutils [req-e8fb4599-1023-4829-bb66-b6672c4c5198 req-615a6c37-6e16-4d3e-9eb9-a5306fdbd430 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "cad1bde5-a40e-4d4b-a51f-1930b1a66cb6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:33:17 compute-0 nova_compute[187223]: 2025-11-28 17:33:17.052 187227 DEBUG oslo_concurrency.lockutils [req-e8fb4599-1023-4829-bb66-b6672c4c5198 req-615a6c37-6e16-4d3e-9eb9-a5306fdbd430 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "cad1bde5-a40e-4d4b-a51f-1930b1a66cb6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:33:17 compute-0 nova_compute[187223]: 2025-11-28 17:33:17.052 187227 DEBUG nova.compute.manager [req-e8fb4599-1023-4829-bb66-b6672c4c5198 req-615a6c37-6e16-4d3e-9eb9-a5306fdbd430 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: cad1bde5-a40e-4d4b-a51f-1930b1a66cb6] No waiting events found dispatching network-vif-unplugged-129faec9-e6b1-488e-b6bb-5b2e828e626f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:33:17 compute-0 nova_compute[187223]: 2025-11-28 17:33:17.052 187227 DEBUG nova.compute.manager [req-e8fb4599-1023-4829-bb66-b6672c4c5198 req-615a6c37-6e16-4d3e-9eb9-a5306fdbd430 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: cad1bde5-a40e-4d4b-a51f-1930b1a66cb6] Received event network-vif-unplugged-129faec9-e6b1-488e-b6bb-5b2e828e626f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 28 17:33:18 compute-0 nova_compute[187223]: 2025-11-28 17:33:18.093 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:33:19 compute-0 nova_compute[187223]: 2025-11-28 17:33:19.214 187227 DEBUG nova.network.neutron [-] [instance: cad1bde5-a40e-4d4b-a51f-1930b1a66cb6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:33:19 compute-0 nova_compute[187223]: 2025-11-28 17:33:19.238 187227 INFO nova.compute.manager [-] [instance: cad1bde5-a40e-4d4b-a51f-1930b1a66cb6] Took 2.84 seconds to deallocate network for instance.
Nov 28 17:33:19 compute-0 nova_compute[187223]: 2025-11-28 17:33:19.546 187227 DEBUG oslo_concurrency.lockutils [None req-02d3e982-70f9-4bcd-a428-8a8b24c68329 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:33:19 compute-0 nova_compute[187223]: 2025-11-28 17:33:19.547 187227 DEBUG oslo_concurrency.lockutils [None req-02d3e982-70f9-4bcd-a428-8a8b24c68329 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:33:19 compute-0 nova_compute[187223]: 2025-11-28 17:33:19.551 187227 DEBUG oslo_concurrency.lockutils [None req-02d3e982-70f9-4bcd-a428-8a8b24c68329 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:33:19 compute-0 nova_compute[187223]: 2025-11-28 17:33:19.623 187227 DEBUG nova.compute.manager [req-bd6739ba-001e-416e-837d-208869a58ea9 req-87efcf21-624f-4b30-9d57-9c5072b79763 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: cad1bde5-a40e-4d4b-a51f-1930b1a66cb6] Received event network-vif-plugged-129faec9-e6b1-488e-b6bb-5b2e828e626f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:33:19 compute-0 nova_compute[187223]: 2025-11-28 17:33:19.624 187227 DEBUG oslo_concurrency.lockutils [req-bd6739ba-001e-416e-837d-208869a58ea9 req-87efcf21-624f-4b30-9d57-9c5072b79763 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "cad1bde5-a40e-4d4b-a51f-1930b1a66cb6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:33:19 compute-0 nova_compute[187223]: 2025-11-28 17:33:19.624 187227 DEBUG oslo_concurrency.lockutils [req-bd6739ba-001e-416e-837d-208869a58ea9 req-87efcf21-624f-4b30-9d57-9c5072b79763 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "cad1bde5-a40e-4d4b-a51f-1930b1a66cb6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:33:19 compute-0 nova_compute[187223]: 2025-11-28 17:33:19.624 187227 DEBUG oslo_concurrency.lockutils [req-bd6739ba-001e-416e-837d-208869a58ea9 req-87efcf21-624f-4b30-9d57-9c5072b79763 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "cad1bde5-a40e-4d4b-a51f-1930b1a66cb6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:33:19 compute-0 nova_compute[187223]: 2025-11-28 17:33:19.624 187227 DEBUG nova.compute.manager [req-bd6739ba-001e-416e-837d-208869a58ea9 req-87efcf21-624f-4b30-9d57-9c5072b79763 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: cad1bde5-a40e-4d4b-a51f-1930b1a66cb6] No waiting events found dispatching network-vif-plugged-129faec9-e6b1-488e-b6bb-5b2e828e626f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:33:19 compute-0 nova_compute[187223]: 2025-11-28 17:33:19.625 187227 WARNING nova.compute.manager [req-bd6739ba-001e-416e-837d-208869a58ea9 req-87efcf21-624f-4b30-9d57-9c5072b79763 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: cad1bde5-a40e-4d4b-a51f-1930b1a66cb6] Received unexpected event network-vif-plugged-129faec9-e6b1-488e-b6bb-5b2e828e626f for instance with vm_state deleted and task_state None.
Nov 28 17:33:19 compute-0 nova_compute[187223]: 2025-11-28 17:33:19.656 187227 INFO nova.scheduler.client.report [None req-02d3e982-70f9-4bcd-a428-8a8b24c68329 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Deleted allocations for instance cad1bde5-a40e-4d4b-a51f-1930b1a66cb6
Nov 28 17:33:19 compute-0 nova_compute[187223]: 2025-11-28 17:33:19.729 187227 DEBUG nova.compute.manager [req-06ea4aef-15b7-4a3f-a8d5-3ef97bcc1070 req-ff269379-c4b2-4698-b3a0-11e4d4eb7362 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: cad1bde5-a40e-4d4b-a51f-1930b1a66cb6] Received event network-vif-deleted-129faec9-e6b1-488e-b6bb-5b2e828e626f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:33:19 compute-0 nova_compute[187223]: 2025-11-28 17:33:19.762 187227 DEBUG oslo_concurrency.lockutils [None req-02d3e982-70f9-4bcd-a428-8a8b24c68329 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Lock "cad1bde5-a40e-4d4b-a51f-1930b1a66cb6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.713s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:33:21 compute-0 nova_compute[187223]: 2025-11-28 17:33:21.335 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:33:21 compute-0 nova_compute[187223]: 2025-11-28 17:33:21.827 187227 DEBUG oslo_concurrency.lockutils [None req-637d8eb0-c383-4c1e-b314-d57cf45eb50b f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Acquiring lock "ea967bd2-166d-4969-ad81-03f2528ed4f5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:33:21 compute-0 nova_compute[187223]: 2025-11-28 17:33:21.827 187227 DEBUG oslo_concurrency.lockutils [None req-637d8eb0-c383-4c1e-b314-d57cf45eb50b f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Lock "ea967bd2-166d-4969-ad81-03f2528ed4f5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:33:21 compute-0 nova_compute[187223]: 2025-11-28 17:33:21.828 187227 DEBUG oslo_concurrency.lockutils [None req-637d8eb0-c383-4c1e-b314-d57cf45eb50b f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Acquiring lock "ea967bd2-166d-4969-ad81-03f2528ed4f5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:33:21 compute-0 nova_compute[187223]: 2025-11-28 17:33:21.828 187227 DEBUG oslo_concurrency.lockutils [None req-637d8eb0-c383-4c1e-b314-d57cf45eb50b f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Lock "ea967bd2-166d-4969-ad81-03f2528ed4f5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:33:21 compute-0 nova_compute[187223]: 2025-11-28 17:33:21.829 187227 DEBUG oslo_concurrency.lockutils [None req-637d8eb0-c383-4c1e-b314-d57cf45eb50b f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Lock "ea967bd2-166d-4969-ad81-03f2528ed4f5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:33:21 compute-0 nova_compute[187223]: 2025-11-28 17:33:21.831 187227 INFO nova.compute.manager [None req-637d8eb0-c383-4c1e-b314-d57cf45eb50b f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] Terminating instance
Nov 28 17:33:21 compute-0 nova_compute[187223]: 2025-11-28 17:33:21.834 187227 DEBUG nova.compute.manager [None req-637d8eb0-c383-4c1e-b314-d57cf45eb50b f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 28 17:33:21 compute-0 kernel: tap5d57189a-27 (unregistering): left promiscuous mode
Nov 28 17:33:21 compute-0 NetworkManager[55763]: <info>  [1764351201.8662] device (tap5d57189a-27): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 28 17:33:21 compute-0 ovn_controller[95574]: 2025-11-28T17:33:21Z|00052|binding|INFO|Releasing lport 5d57189a-27f6-43f1-8c3f-e6c6389babcd from this chassis (sb_readonly=0)
Nov 28 17:33:21 compute-0 ovn_controller[95574]: 2025-11-28T17:33:21Z|00053|binding|INFO|Setting lport 5d57189a-27f6-43f1-8c3f-e6c6389babcd down in Southbound
Nov 28 17:33:21 compute-0 ovn_controller[95574]: 2025-11-28T17:33:21Z|00054|binding|INFO|Removing iface tap5d57189a-27 ovn-installed in OVS
Nov 28 17:33:21 compute-0 nova_compute[187223]: 2025-11-28 17:33:21.875 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:33:21 compute-0 nova_compute[187223]: 2025-11-28 17:33:21.882 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:33:21 compute-0 nova_compute[187223]: 2025-11-28 17:33:21.906 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:33:21 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Deactivated successfully.
Nov 28 17:33:21 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Consumed 18.555s CPU time.
Nov 28 17:33:21 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:33:21.934 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9d:bf:e7 10.100.0.12'], port_security=['fa:16:3e:9d:bf:e7 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'ea967bd2-166d-4969-ad81-03f2528ed4f5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-015f34bb-5da1-42eb-bab2-066f32a46dd5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ea47683b97094cc99b882a5a1b90949f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '82cdfce6-8f2d-44f3-bd0a-80dabea6bfd5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f4d8a34e-2c92-41ae-a2d1-bdb3f1fafb55, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], logical_port=5d57189a-27f6-43f1-8c3f-e6c6389babcd) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 17:33:21 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:33:21.935 104433 INFO neutron.agent.ovn.metadata.agent [-] Port 5d57189a-27f6-43f1-8c3f-e6c6389babcd in datapath 015f34bb-5da1-42eb-bab2-066f32a46dd5 unbound from our chassis
Nov 28 17:33:21 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:33:21.936 104433 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 015f34bb-5da1-42eb-bab2-066f32a46dd5
Nov 28 17:33:21 compute-0 systemd-machined[153517]: Machine qemu-1-instance-00000002 terminated.
Nov 28 17:33:21 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:33:21.960 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[6d76e701-9d7f-4644-bed5-68029ba0be30]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:33:21 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:33:21.995 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[b6630777-e142-4bd7-b0d7-e7ebcf40660f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:33:21 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:33:21.999 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[09e1a25f-718a-4fac-b9c4-9c5a929e500b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:33:22 compute-0 podman[209825]: 2025-11-28 17:33:22.012338414 +0000 UTC m=+0.083455976 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Nov 28 17:33:22 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:33:22.026 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[b3ece435-6a5d-49e3-b8d4-215511400464]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:33:22 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:33:22.046 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[cb87a3b8-a10d-44d0-a0f3-15ddbdf9c6da]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap015f34bb-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3e:3c:9a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 32, 'tx_packets': 15, 'rx_bytes': 1624, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 32, 'tx_packets': 15, 'rx_bytes': 1624, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 424119, 'reachable_time': 33234, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209852, 'error': None, 'target': 'ovnmeta-015f34bb-5da1-42eb-bab2-066f32a46dd5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:33:22 compute-0 nova_compute[187223]: 2025-11-28 17:33:22.057 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:33:22 compute-0 nova_compute[187223]: 2025-11-28 17:33:22.062 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:33:22 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:33:22.064 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[7697d32d-0134-4426-8c39-c4732271c68c]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap015f34bb-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 424137, 'tstamp': 424137}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209855, 'error': None, 'target': 'ovnmeta-015f34bb-5da1-42eb-bab2-066f32a46dd5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap015f34bb-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 424140, 'tstamp': 424140}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209855, 'error': None, 'target': 'ovnmeta-015f34bb-5da1-42eb-bab2-066f32a46dd5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:33:22 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:33:22.066 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap015f34bb-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:33:22 compute-0 nova_compute[187223]: 2025-11-28 17:33:22.067 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:33:22 compute-0 nova_compute[187223]: 2025-11-28 17:33:22.071 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:33:22 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:33:22.071 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap015f34bb-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:33:22 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:33:22.072 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 17:33:22 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:33:22.072 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap015f34bb-50, col_values=(('external_ids', {'iface-id': '2de820a4-0104-4404-a104-bd64f5ebe5e5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:33:22 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:33:22.072 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 17:33:22 compute-0 nova_compute[187223]: 2025-11-28 17:33:22.097 187227 INFO nova.virt.libvirt.driver [-] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] Instance destroyed successfully.
Nov 28 17:33:22 compute-0 nova_compute[187223]: 2025-11-28 17:33:22.098 187227 DEBUG nova.objects.instance [None req-637d8eb0-c383-4c1e-b314-d57cf45eb50b f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Lazy-loading 'resources' on Instance uuid ea967bd2-166d-4969-ad81-03f2528ed4f5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:33:22 compute-0 nova_compute[187223]: 2025-11-28 17:33:22.117 187227 DEBUG nova.virt.libvirt.vif [None req-637d8eb0-c383-4c1e-b314-d57cf45eb50b f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T17:30:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-73137176',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-73137176',id=2,image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-28T17:31:03Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ea47683b97094cc99b882a5a1b90949f',ramdisk_id='',reservation_id='r-130rbgod',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_
project_name='tempest-TestExecuteActionsViaActuator-544878882',owner_user_name='tempest-TestExecuteActionsViaActuator-544878882-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-28T17:31:03Z,user_data=None,user_id='f7ca965410e74fcabced6e50aab5d096',uuid=ea967bd2-166d-4969-ad81-03f2528ed4f5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5d57189a-27f6-43f1-8c3f-e6c6389babcd", "address": "fa:16:3e:9d:bf:e7", "network": {"id": "015f34bb-5da1-42eb-bab2-066f32a46dd5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-213410289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ea47683b97094cc99b882a5a1b90949f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d57189a-27", "ovs_interfaceid": "5d57189a-27f6-43f1-8c3f-e6c6389babcd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 28 17:33:22 compute-0 nova_compute[187223]: 2025-11-28 17:33:22.118 187227 DEBUG nova.network.os_vif_util [None req-637d8eb0-c383-4c1e-b314-d57cf45eb50b f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Converting VIF {"id": "5d57189a-27f6-43f1-8c3f-e6c6389babcd", "address": "fa:16:3e:9d:bf:e7", "network": {"id": "015f34bb-5da1-42eb-bab2-066f32a46dd5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-213410289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ea47683b97094cc99b882a5a1b90949f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d57189a-27", "ovs_interfaceid": "5d57189a-27f6-43f1-8c3f-e6c6389babcd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 17:33:22 compute-0 nova_compute[187223]: 2025-11-28 17:33:22.119 187227 DEBUG nova.network.os_vif_util [None req-637d8eb0-c383-4c1e-b314-d57cf45eb50b f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:9d:bf:e7,bridge_name='br-int',has_traffic_filtering=True,id=5d57189a-27f6-43f1-8c3f-e6c6389babcd,network=Network(015f34bb-5da1-42eb-bab2-066f32a46dd5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5d57189a-27') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 17:33:22 compute-0 nova_compute[187223]: 2025-11-28 17:33:22.119 187227 DEBUG os_vif [None req-637d8eb0-c383-4c1e-b314-d57cf45eb50b f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:9d:bf:e7,bridge_name='br-int',has_traffic_filtering=True,id=5d57189a-27f6-43f1-8c3f-e6c6389babcd,network=Network(015f34bb-5da1-42eb-bab2-066f32a46dd5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5d57189a-27') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 28 17:33:22 compute-0 nova_compute[187223]: 2025-11-28 17:33:22.121 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:33:22 compute-0 nova_compute[187223]: 2025-11-28 17:33:22.121 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5d57189a-27, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:33:22 compute-0 nova_compute[187223]: 2025-11-28 17:33:22.167 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:33:22 compute-0 nova_compute[187223]: 2025-11-28 17:33:22.169 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:33:22 compute-0 nova_compute[187223]: 2025-11-28 17:33:22.171 187227 INFO os_vif [None req-637d8eb0-c383-4c1e-b314-d57cf45eb50b f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:9d:bf:e7,bridge_name='br-int',has_traffic_filtering=True,id=5d57189a-27f6-43f1-8c3f-e6c6389babcd,network=Network(015f34bb-5da1-42eb-bab2-066f32a46dd5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5d57189a-27')
Nov 28 17:33:22 compute-0 nova_compute[187223]: 2025-11-28 17:33:22.171 187227 INFO nova.virt.libvirt.driver [None req-637d8eb0-c383-4c1e-b314-d57cf45eb50b f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] Deleting instance files /var/lib/nova/instances/ea967bd2-166d-4969-ad81-03f2528ed4f5_del
Nov 28 17:33:22 compute-0 nova_compute[187223]: 2025-11-28 17:33:22.172 187227 INFO nova.virt.libvirt.driver [None req-637d8eb0-c383-4c1e-b314-d57cf45eb50b f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] Deletion of /var/lib/nova/instances/ea967bd2-166d-4969-ad81-03f2528ed4f5_del complete
Nov 28 17:33:22 compute-0 nova_compute[187223]: 2025-11-28 17:33:22.238 187227 INFO nova.compute.manager [None req-637d8eb0-c383-4c1e-b314-d57cf45eb50b f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] Took 0.40 seconds to destroy the instance on the hypervisor.
Nov 28 17:33:22 compute-0 nova_compute[187223]: 2025-11-28 17:33:22.238 187227 DEBUG oslo.service.loopingcall [None req-637d8eb0-c383-4c1e-b314-d57cf45eb50b f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 28 17:33:22 compute-0 nova_compute[187223]: 2025-11-28 17:33:22.239 187227 DEBUG nova.compute.manager [-] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 28 17:33:22 compute-0 nova_compute[187223]: 2025-11-28 17:33:22.239 187227 DEBUG nova.network.neutron [-] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 28 17:33:22 compute-0 nova_compute[187223]: 2025-11-28 17:33:22.620 187227 DEBUG nova.compute.manager [req-7f681a2f-f6b4-4f27-9cb8-ffb6ca238837 req-16551be1-e808-4eb8-90ad-d452402ad38d 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] Received event network-vif-unplugged-5d57189a-27f6-43f1-8c3f-e6c6389babcd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:33:22 compute-0 nova_compute[187223]: 2025-11-28 17:33:22.622 187227 DEBUG oslo_concurrency.lockutils [req-7f681a2f-f6b4-4f27-9cb8-ffb6ca238837 req-16551be1-e808-4eb8-90ad-d452402ad38d 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "ea967bd2-166d-4969-ad81-03f2528ed4f5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:33:22 compute-0 nova_compute[187223]: 2025-11-28 17:33:22.622 187227 DEBUG oslo_concurrency.lockutils [req-7f681a2f-f6b4-4f27-9cb8-ffb6ca238837 req-16551be1-e808-4eb8-90ad-d452402ad38d 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "ea967bd2-166d-4969-ad81-03f2528ed4f5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:33:22 compute-0 nova_compute[187223]: 2025-11-28 17:33:22.623 187227 DEBUG oslo_concurrency.lockutils [req-7f681a2f-f6b4-4f27-9cb8-ffb6ca238837 req-16551be1-e808-4eb8-90ad-d452402ad38d 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "ea967bd2-166d-4969-ad81-03f2528ed4f5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:33:22 compute-0 nova_compute[187223]: 2025-11-28 17:33:22.623 187227 DEBUG nova.compute.manager [req-7f681a2f-f6b4-4f27-9cb8-ffb6ca238837 req-16551be1-e808-4eb8-90ad-d452402ad38d 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] No waiting events found dispatching network-vif-unplugged-5d57189a-27f6-43f1-8c3f-e6c6389babcd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:33:22 compute-0 nova_compute[187223]: 2025-11-28 17:33:22.623 187227 DEBUG nova.compute.manager [req-7f681a2f-f6b4-4f27-9cb8-ffb6ca238837 req-16551be1-e808-4eb8-90ad-d452402ad38d 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] Received event network-vif-unplugged-5d57189a-27f6-43f1-8c3f-e6c6389babcd for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 28 17:33:23 compute-0 nova_compute[187223]: 2025-11-28 17:33:23.096 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:33:23 compute-0 nova_compute[187223]: 2025-11-28 17:33:23.717 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:33:24 compute-0 nova_compute[187223]: 2025-11-28 17:33:24.374 187227 DEBUG nova.network.neutron [-] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:33:24 compute-0 nova_compute[187223]: 2025-11-28 17:33:24.407 187227 INFO nova.compute.manager [-] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] Took 2.17 seconds to deallocate network for instance.
Nov 28 17:33:24 compute-0 nova_compute[187223]: 2025-11-28 17:33:24.458 187227 DEBUG oslo_concurrency.lockutils [None req-637d8eb0-c383-4c1e-b314-d57cf45eb50b f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:33:24 compute-0 nova_compute[187223]: 2025-11-28 17:33:24.459 187227 DEBUG oslo_concurrency.lockutils [None req-637d8eb0-c383-4c1e-b314-d57cf45eb50b f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:33:24 compute-0 nova_compute[187223]: 2025-11-28 17:33:24.514 187227 DEBUG nova.compute.manager [req-753198d9-cb08-48ec-849d-1f3cfd8fa436 req-362af20f-34af-41bf-8e41-a1290505ddb3 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] Received event network-vif-deleted-5d57189a-27f6-43f1-8c3f-e6c6389babcd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:33:24 compute-0 nova_compute[187223]: 2025-11-28 17:33:24.558 187227 DEBUG nova.compute.provider_tree [None req-637d8eb0-c383-4c1e-b314-d57cf45eb50b f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 17:33:24 compute-0 nova_compute[187223]: 2025-11-28 17:33:24.581 187227 DEBUG nova.scheduler.client.report [None req-637d8eb0-c383-4c1e-b314-d57cf45eb50b f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 17:33:24 compute-0 nova_compute[187223]: 2025-11-28 17:33:24.614 187227 DEBUG oslo_concurrency.lockutils [None req-637d8eb0-c383-4c1e-b314-d57cf45eb50b f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.155s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:33:24 compute-0 nova_compute[187223]: 2025-11-28 17:33:24.644 187227 INFO nova.scheduler.client.report [None req-637d8eb0-c383-4c1e-b314-d57cf45eb50b f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Deleted allocations for instance ea967bd2-166d-4969-ad81-03f2528ed4f5
Nov 28 17:33:24 compute-0 nova_compute[187223]: 2025-11-28 17:33:24.778 187227 DEBUG oslo_concurrency.lockutils [None req-637d8eb0-c383-4c1e-b314-d57cf45eb50b f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Lock "ea967bd2-166d-4969-ad81-03f2528ed4f5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.950s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:33:24 compute-0 nova_compute[187223]: 2025-11-28 17:33:24.794 187227 DEBUG nova.compute.manager [req-41be6fea-1dcf-4938-b3d1-7f0d49351ccc req-690b8632-5691-46a2-9057-9fd1e18dc0e0 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] Received event network-vif-plugged-5d57189a-27f6-43f1-8c3f-e6c6389babcd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:33:24 compute-0 nova_compute[187223]: 2025-11-28 17:33:24.794 187227 DEBUG oslo_concurrency.lockutils [req-41be6fea-1dcf-4938-b3d1-7f0d49351ccc req-690b8632-5691-46a2-9057-9fd1e18dc0e0 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "ea967bd2-166d-4969-ad81-03f2528ed4f5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:33:24 compute-0 nova_compute[187223]: 2025-11-28 17:33:24.795 187227 DEBUG oslo_concurrency.lockutils [req-41be6fea-1dcf-4938-b3d1-7f0d49351ccc req-690b8632-5691-46a2-9057-9fd1e18dc0e0 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "ea967bd2-166d-4969-ad81-03f2528ed4f5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:33:24 compute-0 nova_compute[187223]: 2025-11-28 17:33:24.795 187227 DEBUG oslo_concurrency.lockutils [req-41be6fea-1dcf-4938-b3d1-7f0d49351ccc req-690b8632-5691-46a2-9057-9fd1e18dc0e0 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "ea967bd2-166d-4969-ad81-03f2528ed4f5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:33:24 compute-0 nova_compute[187223]: 2025-11-28 17:33:24.795 187227 DEBUG nova.compute.manager [req-41be6fea-1dcf-4938-b3d1-7f0d49351ccc req-690b8632-5691-46a2-9057-9fd1e18dc0e0 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] No waiting events found dispatching network-vif-plugged-5d57189a-27f6-43f1-8c3f-e6c6389babcd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:33:24 compute-0 nova_compute[187223]: 2025-11-28 17:33:24.796 187227 WARNING nova.compute.manager [req-41be6fea-1dcf-4938-b3d1-7f0d49351ccc req-690b8632-5691-46a2-9057-9fd1e18dc0e0 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] Received unexpected event network-vif-plugged-5d57189a-27f6-43f1-8c3f-e6c6389babcd for instance with vm_state deleted and task_state None.
Nov 28 17:33:25 compute-0 nova_compute[187223]: 2025-11-28 17:33:25.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:33:25 compute-0 nova_compute[187223]: 2025-11-28 17:33:25.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:33:25 compute-0 nova_compute[187223]: 2025-11-28 17:33:25.801 187227 DEBUG oslo_concurrency.lockutils [None req-d4d4840d-f498-4ac7-b254-e73264dd8972 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Acquiring lock "8ce9d497-2a8a-4c42-b93f-5778740cbc9b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:33:25 compute-0 nova_compute[187223]: 2025-11-28 17:33:25.803 187227 DEBUG oslo_concurrency.lockutils [None req-d4d4840d-f498-4ac7-b254-e73264dd8972 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Lock "8ce9d497-2a8a-4c42-b93f-5778740cbc9b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:33:25 compute-0 nova_compute[187223]: 2025-11-28 17:33:25.803 187227 DEBUG oslo_concurrency.lockutils [None req-d4d4840d-f498-4ac7-b254-e73264dd8972 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Acquiring lock "8ce9d497-2a8a-4c42-b93f-5778740cbc9b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:33:25 compute-0 nova_compute[187223]: 2025-11-28 17:33:25.805 187227 DEBUG oslo_concurrency.lockutils [None req-d4d4840d-f498-4ac7-b254-e73264dd8972 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Lock "8ce9d497-2a8a-4c42-b93f-5778740cbc9b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:33:25 compute-0 nova_compute[187223]: 2025-11-28 17:33:25.806 187227 DEBUG oslo_concurrency.lockutils [None req-d4d4840d-f498-4ac7-b254-e73264dd8972 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Lock "8ce9d497-2a8a-4c42-b93f-5778740cbc9b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:33:25 compute-0 nova_compute[187223]: 2025-11-28 17:33:25.807 187227 INFO nova.compute.manager [None req-d4d4840d-f498-4ac7-b254-e73264dd8972 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: 8ce9d497-2a8a-4c42-b93f-5778740cbc9b] Terminating instance
Nov 28 17:33:25 compute-0 nova_compute[187223]: 2025-11-28 17:33:25.809 187227 DEBUG nova.compute.manager [None req-d4d4840d-f498-4ac7-b254-e73264dd8972 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: 8ce9d497-2a8a-4c42-b93f-5778740cbc9b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 28 17:33:25 compute-0 kernel: tap17cde755-7c (unregistering): left promiscuous mode
Nov 28 17:33:25 compute-0 NetworkManager[55763]: <info>  [1764351205.8372] device (tap17cde755-7c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 28 17:33:25 compute-0 ovn_controller[95574]: 2025-11-28T17:33:25Z|00055|binding|INFO|Releasing lport 17cde755-7c4b-416f-b8da-a328887f9819 from this chassis (sb_readonly=0)
Nov 28 17:33:25 compute-0 ovn_controller[95574]: 2025-11-28T17:33:25Z|00056|binding|INFO|Setting lport 17cde755-7c4b-416f-b8da-a328887f9819 down in Southbound
Nov 28 17:33:25 compute-0 ovn_controller[95574]: 2025-11-28T17:33:25Z|00057|binding|INFO|Removing iface tap17cde755-7c ovn-installed in OVS
Nov 28 17:33:25 compute-0 nova_compute[187223]: 2025-11-28 17:33:25.846 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:33:25 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:33:25.861 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4e:32:1a 10.100.0.8'], port_security=['fa:16:3e:4e:32:1a 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '8ce9d497-2a8a-4c42-b93f-5778740cbc9b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-015f34bb-5da1-42eb-bab2-066f32a46dd5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ea47683b97094cc99b882a5a1b90949f', 'neutron:revision_number': '8', 'neutron:security_group_ids': '82cdfce6-8f2d-44f3-bd0a-80dabea6bfd5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f4d8a34e-2c92-41ae-a2d1-bdb3f1fafb55, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], logical_port=17cde755-7c4b-416f-b8da-a328887f9819) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 17:33:25 compute-0 nova_compute[187223]: 2025-11-28 17:33:25.862 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:33:25 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:33:25.864 104433 INFO neutron.agent.ovn.metadata.agent [-] Port 17cde755-7c4b-416f-b8da-a328887f9819 in datapath 015f34bb-5da1-42eb-bab2-066f32a46dd5 unbound from our chassis
Nov 28 17:33:25 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:33:25.865 104433 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 015f34bb-5da1-42eb-bab2-066f32a46dd5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 17:33:25 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:33:25.866 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[3abb64b7-2f52-4578-9fc7-ec7afa838188]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:33:25 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:33:25.867 104433 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-015f34bb-5da1-42eb-bab2-066f32a46dd5 namespace which is not needed anymore
Nov 28 17:33:25 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000001.scope: Deactivated successfully.
Nov 28 17:33:25 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000001.scope: Consumed 17.839s CPU time.
Nov 28 17:33:25 compute-0 systemd-machined[153517]: Machine qemu-2-instance-00000001 terminated.
Nov 28 17:33:25 compute-0 podman[209875]: 2025-11-28 17:33:25.949909403 +0000 UTC m=+0.068589805 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 28 17:33:26 compute-0 neutron-haproxy-ovnmeta-015f34bb-5da1-42eb-bab2-066f32a46dd5[208921]: [NOTICE]   (208925) : haproxy version is 2.8.14-c23fe91
Nov 28 17:33:26 compute-0 neutron-haproxy-ovnmeta-015f34bb-5da1-42eb-bab2-066f32a46dd5[208921]: [NOTICE]   (208925) : path to executable is /usr/sbin/haproxy
Nov 28 17:33:26 compute-0 neutron-haproxy-ovnmeta-015f34bb-5da1-42eb-bab2-066f32a46dd5[208921]: [WARNING]  (208925) : Exiting Master process...
Nov 28 17:33:26 compute-0 neutron-haproxy-ovnmeta-015f34bb-5da1-42eb-bab2-066f32a46dd5[208921]: [WARNING]  (208925) : Exiting Master process...
Nov 28 17:33:26 compute-0 neutron-haproxy-ovnmeta-015f34bb-5da1-42eb-bab2-066f32a46dd5[208921]: [ALERT]    (208925) : Current worker (208927) exited with code 143 (Terminated)
Nov 28 17:33:26 compute-0 podman[209877]: 2025-11-28 17:33:26.002984189 +0000 UTC m=+0.122368991 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 28 17:33:26 compute-0 neutron-haproxy-ovnmeta-015f34bb-5da1-42eb-bab2-066f32a46dd5[208921]: [WARNING]  (208925) : All workers exited. Exiting... (0)
Nov 28 17:33:26 compute-0 systemd[1]: libpod-88ccfd8f2dab68b51ef911882a9edd57018a2466c55178d8364938a9b9f6d9b1.scope: Deactivated successfully.
Nov 28 17:33:26 compute-0 podman[209935]: 2025-11-28 17:33:26.011514695 +0000 UTC m=+0.046072403 container died 88ccfd8f2dab68b51ef911882a9edd57018a2466c55178d8364938a9b9f6d9b1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-015f34bb-5da1-42eb-bab2-066f32a46dd5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 17:33:26 compute-0 nova_compute[187223]: 2025-11-28 17:33:26.034 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:33:26 compute-0 nova_compute[187223]: 2025-11-28 17:33:26.038 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:33:26 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-88ccfd8f2dab68b51ef911882a9edd57018a2466c55178d8364938a9b9f6d9b1-userdata-shm.mount: Deactivated successfully.
Nov 28 17:33:26 compute-0 systemd[1]: var-lib-containers-storage-overlay-f69dfa19d87e1ff8d2c78c2d3813f2d69f5299be571fa11735ce35cd0a66f638-merged.mount: Deactivated successfully.
Nov 28 17:33:26 compute-0 podman[209935]: 2025-11-28 17:33:26.05454306 +0000 UTC m=+0.089100768 container cleanup 88ccfd8f2dab68b51ef911882a9edd57018a2466c55178d8364938a9b9f6d9b1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-015f34bb-5da1-42eb-bab2-066f32a46dd5, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 28 17:33:26 compute-0 systemd[1]: libpod-conmon-88ccfd8f2dab68b51ef911882a9edd57018a2466c55178d8364938a9b9f6d9b1.scope: Deactivated successfully.
Nov 28 17:33:26 compute-0 nova_compute[187223]: 2025-11-28 17:33:26.074 187227 INFO nova.virt.libvirt.driver [-] [instance: 8ce9d497-2a8a-4c42-b93f-5778740cbc9b] Instance destroyed successfully.
Nov 28 17:33:26 compute-0 nova_compute[187223]: 2025-11-28 17:33:26.074 187227 DEBUG nova.objects.instance [None req-d4d4840d-f498-4ac7-b254-e73264dd8972 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Lazy-loading 'resources' on Instance uuid 8ce9d497-2a8a-4c42-b93f-5778740cbc9b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:33:26 compute-0 nova_compute[187223]: 2025-11-28 17:33:26.089 187227 DEBUG nova.virt.libvirt.vif [None req-d4d4840d-f498-4ac7-b254-e73264dd8972 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T17:30:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-415079104',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-415079104',id=1,image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-28T17:31:29Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ea47683b97094cc99b882a5a1b90949f',ramdisk_id='',reservation_id='r-y95xlee7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-544878882',owner_user_name='tempest-TestExecuteActionsViaActuator-544878882-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-28T17:31:34Z,user_data=None,user_id='f7ca965410e74fcabced6e50aab5d096',uuid=8ce9d497-2a8a-4c42-b93f-5778740cbc9b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "17cde755-7c4b-416f-b8da-a328887f9819", "address": "fa:16:3e:4e:32:1a", "network": {"id": "015f34bb-5da1-42eb-bab2-066f32a46dd5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-213410289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ea47683b97094cc99b882a5a1b90949f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap17cde755-7c", "ovs_interfaceid": "17cde755-7c4b-416f-b8da-a328887f9819", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 28 17:33:26 compute-0 nova_compute[187223]: 2025-11-28 17:33:26.089 187227 DEBUG nova.network.os_vif_util [None req-d4d4840d-f498-4ac7-b254-e73264dd8972 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Converting VIF {"id": "17cde755-7c4b-416f-b8da-a328887f9819", "address": "fa:16:3e:4e:32:1a", "network": {"id": "015f34bb-5da1-42eb-bab2-066f32a46dd5", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-213410289-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ea47683b97094cc99b882a5a1b90949f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap17cde755-7c", "ovs_interfaceid": "17cde755-7c4b-416f-b8da-a328887f9819", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 17:33:26 compute-0 nova_compute[187223]: 2025-11-28 17:33:26.090 187227 DEBUG nova.network.os_vif_util [None req-d4d4840d-f498-4ac7-b254-e73264dd8972 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:4e:32:1a,bridge_name='br-int',has_traffic_filtering=True,id=17cde755-7c4b-416f-b8da-a328887f9819,network=Network(015f34bb-5da1-42eb-bab2-066f32a46dd5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap17cde755-7c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 17:33:26 compute-0 nova_compute[187223]: 2025-11-28 17:33:26.090 187227 DEBUG os_vif [None req-d4d4840d-f498-4ac7-b254-e73264dd8972 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:4e:32:1a,bridge_name='br-int',has_traffic_filtering=True,id=17cde755-7c4b-416f-b8da-a328887f9819,network=Network(015f34bb-5da1-42eb-bab2-066f32a46dd5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap17cde755-7c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 28 17:33:26 compute-0 nova_compute[187223]: 2025-11-28 17:33:26.092 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:33:26 compute-0 nova_compute[187223]: 2025-11-28 17:33:26.092 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap17cde755-7c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:33:26 compute-0 nova_compute[187223]: 2025-11-28 17:33:26.094 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:33:26 compute-0 nova_compute[187223]: 2025-11-28 17:33:26.095 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:33:26 compute-0 nova_compute[187223]: 2025-11-28 17:33:26.097 187227 INFO os_vif [None req-d4d4840d-f498-4ac7-b254-e73264dd8972 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:4e:32:1a,bridge_name='br-int',has_traffic_filtering=True,id=17cde755-7c4b-416f-b8da-a328887f9819,network=Network(015f34bb-5da1-42eb-bab2-066f32a46dd5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap17cde755-7c')
Nov 28 17:33:26 compute-0 nova_compute[187223]: 2025-11-28 17:33:26.097 187227 INFO nova.virt.libvirt.driver [None req-d4d4840d-f498-4ac7-b254-e73264dd8972 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: 8ce9d497-2a8a-4c42-b93f-5778740cbc9b] Deleting instance files /var/lib/nova/instances/8ce9d497-2a8a-4c42-b93f-5778740cbc9b_del
Nov 28 17:33:26 compute-0 nova_compute[187223]: 2025-11-28 17:33:26.103 187227 INFO nova.virt.libvirt.driver [None req-d4d4840d-f498-4ac7-b254-e73264dd8972 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: 8ce9d497-2a8a-4c42-b93f-5778740cbc9b] Deletion of /var/lib/nova/instances/8ce9d497-2a8a-4c42-b93f-5778740cbc9b_del complete
Nov 28 17:33:26 compute-0 podman[209982]: 2025-11-28 17:33:26.120145458 +0000 UTC m=+0.041953425 container remove 88ccfd8f2dab68b51ef911882a9edd57018a2466c55178d8364938a9b9f6d9b1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-015f34bb-5da1-42eb-bab2-066f32a46dd5, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 28 17:33:26 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:33:26.125 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[ce8b8980-da43-4aec-adc3-c4d98f8dfaf7]: (4, ('Fri Nov 28 05:33:25 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-015f34bb-5da1-42eb-bab2-066f32a46dd5 (88ccfd8f2dab68b51ef911882a9edd57018a2466c55178d8364938a9b9f6d9b1)\n88ccfd8f2dab68b51ef911882a9edd57018a2466c55178d8364938a9b9f6d9b1\nFri Nov 28 05:33:26 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-015f34bb-5da1-42eb-bab2-066f32a46dd5 (88ccfd8f2dab68b51ef911882a9edd57018a2466c55178d8364938a9b9f6d9b1)\n88ccfd8f2dab68b51ef911882a9edd57018a2466c55178d8364938a9b9f6d9b1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:33:26 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:33:26.127 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[db08631e-6685-4d7a-b82b-9a18d1d6b794]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:33:26 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:33:26.127 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap015f34bb-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:33:26 compute-0 nova_compute[187223]: 2025-11-28 17:33:26.129 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:33:26 compute-0 kernel: tap015f34bb-50: left promiscuous mode
Nov 28 17:33:26 compute-0 nova_compute[187223]: 2025-11-28 17:33:26.140 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:33:26 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:33:26.143 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[2ddaac60-43dc-4285-b003-964dd8c752fe]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:33:26 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:33:26.158 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[2abb9153-c2df-4aa7-bf25-8fbac30eaa72]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:33:26 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:33:26.159 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[bb8c48fd-a6da-4978-915e-d79bcd3e4423]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:33:26 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:33:26.173 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[c7f6ff30-a7ee-4f97-84e0-c556a415dc47]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 424100, 'reachable_time': 18444, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209995, 'error': None, 'target': 'ovnmeta-015f34bb-5da1-42eb-bab2-066f32a46dd5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:33:26 compute-0 systemd[1]: run-netns-ovnmeta\x2d015f34bb\x2d5da1\x2d42eb\x2dbab2\x2d066f32a46dd5.mount: Deactivated successfully.
Nov 28 17:33:26 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:33:26.188 104546 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-015f34bb-5da1-42eb-bab2-066f32a46dd5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 28 17:33:26 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:33:26.189 104546 DEBUG oslo.privsep.daemon [-] privsep: reply[e9d7f111-0da8-4326-b414-1f6ec6577427]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:33:26 compute-0 nova_compute[187223]: 2025-11-28 17:33:26.228 187227 INFO nova.compute.manager [None req-d4d4840d-f498-4ac7-b254-e73264dd8972 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] [instance: 8ce9d497-2a8a-4c42-b93f-5778740cbc9b] Took 0.42 seconds to destroy the instance on the hypervisor.
Nov 28 17:33:26 compute-0 nova_compute[187223]: 2025-11-28 17:33:26.232 187227 DEBUG oslo.service.loopingcall [None req-d4d4840d-f498-4ac7-b254-e73264dd8972 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 28 17:33:26 compute-0 nova_compute[187223]: 2025-11-28 17:33:26.232 187227 DEBUG nova.compute.manager [-] [instance: 8ce9d497-2a8a-4c42-b93f-5778740cbc9b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 28 17:33:26 compute-0 nova_compute[187223]: 2025-11-28 17:33:26.233 187227 DEBUG nova.network.neutron [-] [instance: 8ce9d497-2a8a-4c42-b93f-5778740cbc9b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 28 17:33:26 compute-0 nova_compute[187223]: 2025-11-28 17:33:26.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:33:26 compute-0 nova_compute[187223]: 2025-11-28 17:33:26.684 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 17:33:26 compute-0 nova_compute[187223]: 2025-11-28 17:33:26.685 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:33:26 compute-0 nova_compute[187223]: 2025-11-28 17:33:26.952 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:33:26 compute-0 nova_compute[187223]: 2025-11-28 17:33:26.953 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:33:26 compute-0 nova_compute[187223]: 2025-11-28 17:33:26.953 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:33:26 compute-0 nova_compute[187223]: 2025-11-28 17:33:26.953 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 17:33:27 compute-0 nova_compute[187223]: 2025-11-28 17:33:27.008 187227 DEBUG nova.compute.manager [req-86c7c46f-aa8c-44d2-b67a-bbbdc6b210fb req-f76d3b8e-83ac-4a59-b0e9-813493f25c90 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 8ce9d497-2a8a-4c42-b93f-5778740cbc9b] Received event network-vif-unplugged-17cde755-7c4b-416f-b8da-a328887f9819 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:33:27 compute-0 nova_compute[187223]: 2025-11-28 17:33:27.008 187227 DEBUG oslo_concurrency.lockutils [req-86c7c46f-aa8c-44d2-b67a-bbbdc6b210fb req-f76d3b8e-83ac-4a59-b0e9-813493f25c90 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "8ce9d497-2a8a-4c42-b93f-5778740cbc9b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:33:27 compute-0 nova_compute[187223]: 2025-11-28 17:33:27.009 187227 DEBUG oslo_concurrency.lockutils [req-86c7c46f-aa8c-44d2-b67a-bbbdc6b210fb req-f76d3b8e-83ac-4a59-b0e9-813493f25c90 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "8ce9d497-2a8a-4c42-b93f-5778740cbc9b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:33:27 compute-0 nova_compute[187223]: 2025-11-28 17:33:27.009 187227 DEBUG oslo_concurrency.lockutils [req-86c7c46f-aa8c-44d2-b67a-bbbdc6b210fb req-f76d3b8e-83ac-4a59-b0e9-813493f25c90 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "8ce9d497-2a8a-4c42-b93f-5778740cbc9b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:33:27 compute-0 nova_compute[187223]: 2025-11-28 17:33:27.009 187227 DEBUG nova.compute.manager [req-86c7c46f-aa8c-44d2-b67a-bbbdc6b210fb req-f76d3b8e-83ac-4a59-b0e9-813493f25c90 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 8ce9d497-2a8a-4c42-b93f-5778740cbc9b] No waiting events found dispatching network-vif-unplugged-17cde755-7c4b-416f-b8da-a328887f9819 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:33:27 compute-0 nova_compute[187223]: 2025-11-28 17:33:27.009 187227 DEBUG nova.compute.manager [req-86c7c46f-aa8c-44d2-b67a-bbbdc6b210fb req-f76d3b8e-83ac-4a59-b0e9-813493f25c90 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 8ce9d497-2a8a-4c42-b93f-5778740cbc9b] Received event network-vif-unplugged-17cde755-7c4b-416f-b8da-a328887f9819 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 28 17:33:27 compute-0 nova_compute[187223]: 2025-11-28 17:33:27.123 187227 WARNING nova.virt.libvirt.driver [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 17:33:27 compute-0 nova_compute[187223]: 2025-11-28 17:33:27.125 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5875MB free_disk=73.344970703125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 17:33:27 compute-0 nova_compute[187223]: 2025-11-28 17:33:27.125 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:33:27 compute-0 nova_compute[187223]: 2025-11-28 17:33:27.125 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:33:27 compute-0 nova_compute[187223]: 2025-11-28 17:33:27.201 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Instance 8ce9d497-2a8a-4c42-b93f-5778740cbc9b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 17:33:27 compute-0 nova_compute[187223]: 2025-11-28 17:33:27.201 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 17:33:27 compute-0 nova_compute[187223]: 2025-11-28 17:33:27.201 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=704MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 17:33:27 compute-0 nova_compute[187223]: 2025-11-28 17:33:27.284 187227 DEBUG nova.compute.provider_tree [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 17:33:27 compute-0 nova_compute[187223]: 2025-11-28 17:33:27.420 187227 DEBUG nova.scheduler.client.report [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 17:33:27 compute-0 nova_compute[187223]: 2025-11-28 17:33:27.462 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 17:33:27 compute-0 nova_compute[187223]: 2025-11-28 17:33:27.462 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.337s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:33:27 compute-0 nova_compute[187223]: 2025-11-28 17:33:27.670 187227 DEBUG nova.network.neutron [-] [instance: 8ce9d497-2a8a-4c42-b93f-5778740cbc9b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:33:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:33:27.678 104433 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:33:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:33:27.679 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:33:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:33:27.679 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:33:27 compute-0 nova_compute[187223]: 2025-11-28 17:33:27.693 187227 INFO nova.compute.manager [-] [instance: 8ce9d497-2a8a-4c42-b93f-5778740cbc9b] Took 1.46 seconds to deallocate network for instance.
Nov 28 17:33:27 compute-0 nova_compute[187223]: 2025-11-28 17:33:27.762 187227 DEBUG oslo_concurrency.lockutils [None req-d4d4840d-f498-4ac7-b254-e73264dd8972 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:33:27 compute-0 nova_compute[187223]: 2025-11-28 17:33:27.763 187227 DEBUG oslo_concurrency.lockutils [None req-d4d4840d-f498-4ac7-b254-e73264dd8972 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:33:27 compute-0 nova_compute[187223]: 2025-11-28 17:33:27.810 187227 DEBUG nova.compute.provider_tree [None req-d4d4840d-f498-4ac7-b254-e73264dd8972 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 17:33:27 compute-0 nova_compute[187223]: 2025-11-28 17:33:27.826 187227 DEBUG nova.scheduler.client.report [None req-d4d4840d-f498-4ac7-b254-e73264dd8972 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 17:33:27 compute-0 nova_compute[187223]: 2025-11-28 17:33:27.856 187227 DEBUG oslo_concurrency.lockutils [None req-d4d4840d-f498-4ac7-b254-e73264dd8972 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.093s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:33:27 compute-0 nova_compute[187223]: 2025-11-28 17:33:27.906 187227 INFO nova.scheduler.client.report [None req-d4d4840d-f498-4ac7-b254-e73264dd8972 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Deleted allocations for instance 8ce9d497-2a8a-4c42-b93f-5778740cbc9b
Nov 28 17:33:27 compute-0 nova_compute[187223]: 2025-11-28 17:33:27.986 187227 DEBUG oslo_concurrency.lockutils [None req-d4d4840d-f498-4ac7-b254-e73264dd8972 f7ca965410e74fcabced6e50aab5d096 ea47683b97094cc99b882a5a1b90949f - - default default] Lock "8ce9d497-2a8a-4c42-b93f-5778740cbc9b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.183s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:33:28 compute-0 nova_compute[187223]: 2025-11-28 17:33:28.098 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:33:28 compute-0 podman[209998]: 2025-11-28 17:33:28.203363824 +0000 UTC m=+0.060152691 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.expose-services=, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, version=9.6)
Nov 28 17:33:28 compute-0 nova_compute[187223]: 2025-11-28 17:33:28.462 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:33:29 compute-0 nova_compute[187223]: 2025-11-28 17:33:29.015 187227 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764351194.0131226, 11258d07-82c4-4bf7-9965-9bd6fa9f6a10 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:33:29 compute-0 nova_compute[187223]: 2025-11-28 17:33:29.015 187227 INFO nova.compute.manager [-] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] VM Stopped (Lifecycle Event)
Nov 28 17:33:29 compute-0 nova_compute[187223]: 2025-11-28 17:33:29.041 187227 DEBUG nova.compute.manager [None req-31dd3085-822d-4a70-840d-378ba9df23de - - - - - -] [instance: 11258d07-82c4-4bf7-9965-9bd6fa9f6a10] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:33:29 compute-0 nova_compute[187223]: 2025-11-28 17:33:29.117 187227 DEBUG nova.compute.manager [req-7a0f5237-e177-41bc-a359-b9a519e6936c req-cd6c3d72-1bf5-4df3-a189-4767404de7cc 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 8ce9d497-2a8a-4c42-b93f-5778740cbc9b] Received event network-vif-plugged-17cde755-7c4b-416f-b8da-a328887f9819 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:33:29 compute-0 nova_compute[187223]: 2025-11-28 17:33:29.117 187227 DEBUG oslo_concurrency.lockutils [req-7a0f5237-e177-41bc-a359-b9a519e6936c req-cd6c3d72-1bf5-4df3-a189-4767404de7cc 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "8ce9d497-2a8a-4c42-b93f-5778740cbc9b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:33:29 compute-0 nova_compute[187223]: 2025-11-28 17:33:29.117 187227 DEBUG oslo_concurrency.lockutils [req-7a0f5237-e177-41bc-a359-b9a519e6936c req-cd6c3d72-1bf5-4df3-a189-4767404de7cc 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "8ce9d497-2a8a-4c42-b93f-5778740cbc9b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:33:29 compute-0 nova_compute[187223]: 2025-11-28 17:33:29.118 187227 DEBUG oslo_concurrency.lockutils [req-7a0f5237-e177-41bc-a359-b9a519e6936c req-cd6c3d72-1bf5-4df3-a189-4767404de7cc 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "8ce9d497-2a8a-4c42-b93f-5778740cbc9b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:33:29 compute-0 nova_compute[187223]: 2025-11-28 17:33:29.118 187227 DEBUG nova.compute.manager [req-7a0f5237-e177-41bc-a359-b9a519e6936c req-cd6c3d72-1bf5-4df3-a189-4767404de7cc 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 8ce9d497-2a8a-4c42-b93f-5778740cbc9b] No waiting events found dispatching network-vif-plugged-17cde755-7c4b-416f-b8da-a328887f9819 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:33:29 compute-0 nova_compute[187223]: 2025-11-28 17:33:29.118 187227 WARNING nova.compute.manager [req-7a0f5237-e177-41bc-a359-b9a519e6936c req-cd6c3d72-1bf5-4df3-a189-4767404de7cc 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 8ce9d497-2a8a-4c42-b93f-5778740cbc9b] Received unexpected event network-vif-plugged-17cde755-7c4b-416f-b8da-a328887f9819 for instance with vm_state deleted and task_state None.
Nov 28 17:33:29 compute-0 nova_compute[187223]: 2025-11-28 17:33:29.118 187227 DEBUG nova.compute.manager [req-7a0f5237-e177-41bc-a359-b9a519e6936c req-cd6c3d72-1bf5-4df3-a189-4767404de7cc 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 8ce9d497-2a8a-4c42-b93f-5778740cbc9b] Received event network-vif-deleted-17cde755-7c4b-416f-b8da-a328887f9819 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:33:29 compute-0 podman[197556]: time="2025-11-28T17:33:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:33:29 compute-0 podman[197556]: @ - - [28/Nov/2025:17:33:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 28 17:33:29 compute-0 podman[197556]: @ - - [28/Nov/2025:17:33:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2582 "" "Go-http-client/1.1"
Nov 28 17:33:30 compute-0 nova_compute[187223]: 2025-11-28 17:33:30.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:33:30 compute-0 nova_compute[187223]: 2025-11-28 17:33:30.683 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 17:33:30 compute-0 nova_compute[187223]: 2025-11-28 17:33:30.683 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 17:33:30 compute-0 nova_compute[187223]: 2025-11-28 17:33:30.702 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 17:33:30 compute-0 nova_compute[187223]: 2025-11-28 17:33:30.702 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:33:31 compute-0 nova_compute[187223]: 2025-11-28 17:33:31.097 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:33:31 compute-0 nova_compute[187223]: 2025-11-28 17:33:31.303 187227 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764351196.3023317, cad1bde5-a40e-4d4b-a51f-1930b1a66cb6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:33:31 compute-0 nova_compute[187223]: 2025-11-28 17:33:31.303 187227 INFO nova.compute.manager [-] [instance: cad1bde5-a40e-4d4b-a51f-1930b1a66cb6] VM Stopped (Lifecycle Event)
Nov 28 17:33:31 compute-0 nova_compute[187223]: 2025-11-28 17:33:31.337 187227 DEBUG nova.compute.manager [None req-6defcd01-ad48-4d73-a32e-0f2dd1d58268 - - - - - -] [instance: cad1bde5-a40e-4d4b-a51f-1930b1a66cb6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:33:31 compute-0 openstack_network_exporter[199717]: ERROR   17:33:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:33:31 compute-0 openstack_network_exporter[199717]: ERROR   17:33:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:33:31 compute-0 openstack_network_exporter[199717]: ERROR   17:33:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:33:31 compute-0 openstack_network_exporter[199717]: ERROR   17:33:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:33:31 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:33:31 compute-0 openstack_network_exporter[199717]: ERROR   17:33:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:33:31 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:33:31 compute-0 nova_compute[187223]: 2025-11-28 17:33:31.698 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:33:33 compute-0 nova_compute[187223]: 2025-11-28 17:33:33.101 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:33:35 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:33:35.821 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3e:3c:af', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '0e:e0:60:f9:5f:25'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 17:33:35 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:33:35.822 104433 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 17:33:35 compute-0 nova_compute[187223]: 2025-11-28 17:33:35.822 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:33:36 compute-0 nova_compute[187223]: 2025-11-28 17:33:36.125 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:33:37 compute-0 nova_compute[187223]: 2025-11-28 17:33:37.097 187227 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764351202.0951335, ea967bd2-166d-4969-ad81-03f2528ed4f5 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:33:37 compute-0 nova_compute[187223]: 2025-11-28 17:33:37.098 187227 INFO nova.compute.manager [-] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] VM Stopped (Lifecycle Event)
Nov 28 17:33:37 compute-0 nova_compute[187223]: 2025-11-28 17:33:37.397 187227 DEBUG nova.compute.manager [None req-a4b99c38-dd78-443b-9d32-41a53c7c7a42 - - - - - -] [instance: ea967bd2-166d-4969-ad81-03f2528ed4f5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:33:38 compute-0 nova_compute[187223]: 2025-11-28 17:33:38.104 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:33:38 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:33:38.825 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2ad2dbac-a967-40fb-b69b-7c374c5f8e9d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:33:41 compute-0 nova_compute[187223]: 2025-11-28 17:33:41.073 187227 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764351206.072185, 8ce9d497-2a8a-4c42-b93f-5778740cbc9b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:33:41 compute-0 nova_compute[187223]: 2025-11-28 17:33:41.074 187227 INFO nova.compute.manager [-] [instance: 8ce9d497-2a8a-4c42-b93f-5778740cbc9b] VM Stopped (Lifecycle Event)
Nov 28 17:33:41 compute-0 nova_compute[187223]: 2025-11-28 17:33:41.110 187227 DEBUG nova.compute.manager [None req-7b7e83fd-e8e1-48ce-93f8-34e7fb52f8b0 - - - - - -] [instance: 8ce9d497-2a8a-4c42-b93f-5778740cbc9b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:33:41 compute-0 nova_compute[187223]: 2025-11-28 17:33:41.128 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:33:43 compute-0 nova_compute[187223]: 2025-11-28 17:33:43.107 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:33:44 compute-0 podman[210017]: 2025-11-28 17:33:44.233422373 +0000 UTC m=+0.073940791 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 17:33:46 compute-0 nova_compute[187223]: 2025-11-28 17:33:46.132 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:33:48 compute-0 nova_compute[187223]: 2025-11-28 17:33:48.109 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:33:51 compute-0 nova_compute[187223]: 2025-11-28 17:33:51.135 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:33:52 compute-0 podman[210042]: 2025-11-28 17:33:52.199369877 +0000 UTC m=+0.059944195 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, 
managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 28 17:33:53 compute-0 nova_compute[187223]: 2025-11-28 17:33:53.112 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:33:56 compute-0 nova_compute[187223]: 2025-11-28 17:33:56.174 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:33:56 compute-0 podman[210062]: 2025-11-28 17:33:56.247864228 +0000 UTC m=+0.073956781 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 28 17:33:56 compute-0 podman[210063]: 2025-11-28 17:33:56.277897476 +0000 UTC m=+0.103987689 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 28 17:33:58 compute-0 nova_compute[187223]: 2025-11-28 17:33:58.114 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:33:59 compute-0 podman[210107]: 2025-11-28 17:33:59.205912641 +0000 UTC m=+0.065448024 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, vendor=Red Hat, Inc., managed_by=edpm_ansible, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, 
architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., distribution-scope=public, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 28 17:33:59 compute-0 podman[197556]: time="2025-11-28T17:33:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:33:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:33:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 28 17:33:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:33:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2586 "" "Go-http-client/1.1"
Nov 28 17:34:01 compute-0 nova_compute[187223]: 2025-11-28 17:34:01.177 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:34:01 compute-0 openstack_network_exporter[199717]: ERROR   17:34:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:34:01 compute-0 openstack_network_exporter[199717]: ERROR   17:34:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:34:01 compute-0 openstack_network_exporter[199717]: ERROR   17:34:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:34:01 compute-0 openstack_network_exporter[199717]: ERROR   17:34:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:34:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:34:01 compute-0 openstack_network_exporter[199717]: ERROR   17:34:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:34:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:34:03 compute-0 nova_compute[187223]: 2025-11-28 17:34:03.116 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:34:06 compute-0 nova_compute[187223]: 2025-11-28 17:34:06.210 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:34:08 compute-0 nova_compute[187223]: 2025-11-28 17:34:08.118 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:34:11 compute-0 nova_compute[187223]: 2025-11-28 17:34:11.214 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:34:13 compute-0 nova_compute[187223]: 2025-11-28 17:34:13.120 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:34:15 compute-0 podman[210129]: 2025-11-28 17:34:15.216965731 +0000 UTC m=+0.080116789 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 17:34:16 compute-0 nova_compute[187223]: 2025-11-28 17:34:16.249 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:34:18 compute-0 nova_compute[187223]: 2025-11-28 17:34:18.123 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:34:18 compute-0 ovn_controller[95574]: 2025-11-28T17:34:18Z|00058|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Nov 28 17:34:21 compute-0 nova_compute[187223]: 2025-11-28 17:34:21.252 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:34:23 compute-0 nova_compute[187223]: 2025-11-28 17:34:23.125 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:34:23 compute-0 podman[210153]: 2025-11-28 17:34:23.196202031 +0000 UTC m=+0.059137672 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 28 17:34:25 compute-0 nova_compute[187223]: 2025-11-28 17:34:25.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:34:26 compute-0 nova_compute[187223]: 2025-11-28 17:34:26.256 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:34:26 compute-0 nova_compute[187223]: 2025-11-28 17:34:26.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:34:26 compute-0 nova_compute[187223]: 2025-11-28 17:34:26.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:34:26 compute-0 nova_compute[187223]: 2025-11-28 17:34:26.684 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 17:34:27 compute-0 podman[210174]: 2025-11-28 17:34:27.215953888 +0000 UTC m=+0.068255625 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 17:34:27 compute-0 podman[210175]: 2025-11-28 17:34:27.25541606 +0000 UTC m=+0.104489164 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 17:34:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:34:27.679 104433 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:34:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:34:27.680 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:34:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:34:27.681 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:34:27 compute-0 nova_compute[187223]: 2025-11-28 17:34:27.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:34:27 compute-0 nova_compute[187223]: 2025-11-28 17:34:27.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:34:27 compute-0 nova_compute[187223]: 2025-11-28 17:34:27.792 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:34:27 compute-0 nova_compute[187223]: 2025-11-28 17:34:27.792 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:34:27 compute-0 nova_compute[187223]: 2025-11-28 17:34:27.793 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:34:27 compute-0 nova_compute[187223]: 2025-11-28 17:34:27.793 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 17:34:27 compute-0 nova_compute[187223]: 2025-11-28 17:34:27.951 187227 WARNING nova.virt.libvirt.driver [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 17:34:27 compute-0 nova_compute[187223]: 2025-11-28 17:34:27.952 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5889MB free_disk=73.3452262878418GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 17:34:27 compute-0 nova_compute[187223]: 2025-11-28 17:34:27.952 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:34:27 compute-0 nova_compute[187223]: 2025-11-28 17:34:27.953 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:34:28 compute-0 nova_compute[187223]: 2025-11-28 17:34:28.048 187227 DEBUG oslo_concurrency.lockutils [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Acquiring lock "f333fabf-4a60-49fb-b6dc-d0cbeb847c8f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:34:28 compute-0 nova_compute[187223]: 2025-11-28 17:34:28.048 187227 DEBUG oslo_concurrency.lockutils [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Lock "f333fabf-4a60-49fb-b6dc-d0cbeb847c8f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:34:28 compute-0 nova_compute[187223]: 2025-11-28 17:34:28.066 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Instance f333fabf-4a60-49fb-b6dc-d0cbeb847c8f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1692
Nov 28 17:34:28 compute-0 nova_compute[187223]: 2025-11-28 17:34:28.066 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 17:34:28 compute-0 nova_compute[187223]: 2025-11-28 17:34:28.066 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 17:34:28 compute-0 nova_compute[187223]: 2025-11-28 17:34:28.073 187227 DEBUG nova.compute.manager [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 28 17:34:28 compute-0 nova_compute[187223]: 2025-11-28 17:34:28.119 187227 DEBUG nova.compute.provider_tree [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 17:34:28 compute-0 nova_compute[187223]: 2025-11-28 17:34:28.127 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:34:28 compute-0 nova_compute[187223]: 2025-11-28 17:34:28.147 187227 DEBUG nova.scheduler.client.report [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 17:34:28 compute-0 nova_compute[187223]: 2025-11-28 17:34:28.154 187227 DEBUG oslo_concurrency.lockutils [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:34:28 compute-0 nova_compute[187223]: 2025-11-28 17:34:28.179 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 17:34:28 compute-0 nova_compute[187223]: 2025-11-28 17:34:28.180 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.227s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:34:28 compute-0 nova_compute[187223]: 2025-11-28 17:34:28.180 187227 DEBUG oslo_concurrency.lockutils [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.027s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:34:28 compute-0 nova_compute[187223]: 2025-11-28 17:34:28.186 187227 DEBUG nova.virt.hardware [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 28 17:34:28 compute-0 nova_compute[187223]: 2025-11-28 17:34:28.187 187227 INFO nova.compute.claims [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] Claim successful on node compute-0.ctlplane.example.com
Nov 28 17:34:28 compute-0 nova_compute[187223]: 2025-11-28 17:34:28.317 187227 DEBUG nova.compute.provider_tree [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 17:34:28 compute-0 nova_compute[187223]: 2025-11-28 17:34:28.340 187227 DEBUG nova.scheduler.client.report [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 17:34:28 compute-0 nova_compute[187223]: 2025-11-28 17:34:28.363 187227 DEBUG oslo_concurrency.lockutils [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.182s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:34:28 compute-0 nova_compute[187223]: 2025-11-28 17:34:28.364 187227 DEBUG nova.compute.manager [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 28 17:34:28 compute-0 nova_compute[187223]: 2025-11-28 17:34:28.420 187227 DEBUG nova.compute.manager [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 28 17:34:28 compute-0 nova_compute[187223]: 2025-11-28 17:34:28.421 187227 DEBUG nova.network.neutron [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 28 17:34:28 compute-0 nova_compute[187223]: 2025-11-28 17:34:28.467 187227 INFO nova.virt.libvirt.driver [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 28 17:34:28 compute-0 nova_compute[187223]: 2025-11-28 17:34:28.503 187227 DEBUG nova.compute.manager [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 28 17:34:28 compute-0 nova_compute[187223]: 2025-11-28 17:34:28.645 187227 DEBUG nova.compute.manager [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 28 17:34:28 compute-0 nova_compute[187223]: 2025-11-28 17:34:28.648 187227 DEBUG nova.virt.libvirt.driver [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 28 17:34:28 compute-0 nova_compute[187223]: 2025-11-28 17:34:28.649 187227 INFO nova.virt.libvirt.driver [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] Creating image(s)
Nov 28 17:34:28 compute-0 nova_compute[187223]: 2025-11-28 17:34:28.650 187227 DEBUG oslo_concurrency.lockutils [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Acquiring lock "/var/lib/nova/instances/f333fabf-4a60-49fb-b6dc-d0cbeb847c8f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:34:28 compute-0 nova_compute[187223]: 2025-11-28 17:34:28.650 187227 DEBUG oslo_concurrency.lockutils [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Lock "/var/lib/nova/instances/f333fabf-4a60-49fb-b6dc-d0cbeb847c8f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:34:28 compute-0 nova_compute[187223]: 2025-11-28 17:34:28.651 187227 DEBUG oslo_concurrency.lockutils [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Lock "/var/lib/nova/instances/f333fabf-4a60-49fb-b6dc-d0cbeb847c8f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:34:28 compute-0 nova_compute[187223]: 2025-11-28 17:34:28.665 187227 DEBUG oslo_concurrency.processutils [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:34:28 compute-0 nova_compute[187223]: 2025-11-28 17:34:28.737 187227 DEBUG oslo_concurrency.processutils [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:34:28 compute-0 nova_compute[187223]: 2025-11-28 17:34:28.738 187227 DEBUG oslo_concurrency.lockutils [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Acquiring lock "bd6565e0cc32420aac21dd3de8842bc53bb13eba" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:34:28 compute-0 nova_compute[187223]: 2025-11-28 17:34:28.739 187227 DEBUG oslo_concurrency.lockutils [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Lock "bd6565e0cc32420aac21dd3de8842bc53bb13eba" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:34:28 compute-0 nova_compute[187223]: 2025-11-28 17:34:28.752 187227 DEBUG oslo_concurrency.processutils [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:34:28 compute-0 nova_compute[187223]: 2025-11-28 17:34:28.812 187227 DEBUG oslo_concurrency.processutils [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:34:28 compute-0 nova_compute[187223]: 2025-11-28 17:34:28.813 187227 DEBUG oslo_concurrency.processutils [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba,backing_fmt=raw /var/lib/nova/instances/f333fabf-4a60-49fb-b6dc-d0cbeb847c8f/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:34:28 compute-0 nova_compute[187223]: 2025-11-28 17:34:28.854 187227 DEBUG oslo_concurrency.processutils [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba,backing_fmt=raw /var/lib/nova/instances/f333fabf-4a60-49fb-b6dc-d0cbeb847c8f/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:34:28 compute-0 nova_compute[187223]: 2025-11-28 17:34:28.855 187227 DEBUG oslo_concurrency.lockutils [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Lock "bd6565e0cc32420aac21dd3de8842bc53bb13eba" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.116s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:34:28 compute-0 nova_compute[187223]: 2025-11-28 17:34:28.856 187227 DEBUG oslo_concurrency.processutils [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:34:28 compute-0 nova_compute[187223]: 2025-11-28 17:34:28.924 187227 DEBUG oslo_concurrency.processutils [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:34:28 compute-0 nova_compute[187223]: 2025-11-28 17:34:28.926 187227 DEBUG nova.virt.disk.api [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Checking if we can resize image /var/lib/nova/instances/f333fabf-4a60-49fb-b6dc-d0cbeb847c8f/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 28 17:34:28 compute-0 nova_compute[187223]: 2025-11-28 17:34:28.926 187227 DEBUG oslo_concurrency.processutils [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f333fabf-4a60-49fb-b6dc-d0cbeb847c8f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:34:28 compute-0 nova_compute[187223]: 2025-11-28 17:34:28.966 187227 DEBUG nova.policy [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a2fe03a116c1411ebaa81bbd0334f5ed', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd63484fb636c435b8307abd484cb8aa7', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 28 17:34:28 compute-0 nova_compute[187223]: 2025-11-28 17:34:28.998 187227 DEBUG oslo_concurrency.processutils [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f333fabf-4a60-49fb-b6dc-d0cbeb847c8f/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:34:28 compute-0 nova_compute[187223]: 2025-11-28 17:34:28.999 187227 DEBUG nova.virt.disk.api [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Cannot resize image /var/lib/nova/instances/f333fabf-4a60-49fb-b6dc-d0cbeb847c8f/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 28 17:34:28 compute-0 nova_compute[187223]: 2025-11-28 17:34:28.999 187227 DEBUG nova.objects.instance [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Lazy-loading 'migration_context' on Instance uuid f333fabf-4a60-49fb-b6dc-d0cbeb847c8f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:34:29 compute-0 nova_compute[187223]: 2025-11-28 17:34:29.015 187227 DEBUG nova.virt.libvirt.driver [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 28 17:34:29 compute-0 nova_compute[187223]: 2025-11-28 17:34:29.015 187227 DEBUG nova.virt.libvirt.driver [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] Ensure instance console log exists: /var/lib/nova/instances/f333fabf-4a60-49fb-b6dc-d0cbeb847c8f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 28 17:34:29 compute-0 nova_compute[187223]: 2025-11-28 17:34:29.016 187227 DEBUG oslo_concurrency.lockutils [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:34:29 compute-0 nova_compute[187223]: 2025-11-28 17:34:29.016 187227 DEBUG oslo_concurrency.lockutils [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:34:29 compute-0 nova_compute[187223]: 2025-11-28 17:34:29.017 187227 DEBUG oslo_concurrency.lockutils [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:34:29 compute-0 nova_compute[187223]: 2025-11-28 17:34:29.176 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:34:29 compute-0 nova_compute[187223]: 2025-11-28 17:34:29.206 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:34:29 compute-0 nova_compute[187223]: 2025-11-28 17:34:29.697 187227 DEBUG nova.network.neutron [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] Successfully created port: b2417a6a-f805-4c80-adc3-9a9223d1c16a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 28 17:34:29 compute-0 podman[197556]: time="2025-11-28T17:34:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:34:29 compute-0 podman[197556]: @ - - [28/Nov/2025:17:34:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 28 17:34:29 compute-0 podman[197556]: @ - - [28/Nov/2025:17:34:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2588 "" "Go-http-client/1.1"
Nov 28 17:34:30 compute-0 podman[210232]: 2025-11-28 17:34:30.241226665 +0000 UTC m=+0.092138476 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, distribution-scope=public, version=9.6, name=ubi9-minimal, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc.)
Nov 28 17:34:30 compute-0 nova_compute[187223]: 2025-11-28 17:34:30.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:34:31 compute-0 nova_compute[187223]: 2025-11-28 17:34:31.260 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:34:31 compute-0 openstack_network_exporter[199717]: ERROR   17:34:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:34:31 compute-0 openstack_network_exporter[199717]: ERROR   17:34:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:34:31 compute-0 openstack_network_exporter[199717]: ERROR   17:34:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:34:31 compute-0 openstack_network_exporter[199717]: ERROR   17:34:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:34:31 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:34:31 compute-0 openstack_network_exporter[199717]: ERROR   17:34:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:34:31 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:34:31 compute-0 nova_compute[187223]: 2025-11-28 17:34:31.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:34:31 compute-0 nova_compute[187223]: 2025-11-28 17:34:31.685 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 17:34:31 compute-0 nova_compute[187223]: 2025-11-28 17:34:31.685 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 17:34:31 compute-0 nova_compute[187223]: 2025-11-28 17:34:31.710 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Nov 28 17:34:31 compute-0 nova_compute[187223]: 2025-11-28 17:34:31.710 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 17:34:31 compute-0 nova_compute[187223]: 2025-11-28 17:34:31.836 187227 DEBUG nova.network.neutron [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] Successfully updated port: b2417a6a-f805-4c80-adc3-9a9223d1c16a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 28 17:34:31 compute-0 nova_compute[187223]: 2025-11-28 17:34:31.901 187227 DEBUG oslo_concurrency.lockutils [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Acquiring lock "refresh_cache-f333fabf-4a60-49fb-b6dc-d0cbeb847c8f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 17:34:31 compute-0 nova_compute[187223]: 2025-11-28 17:34:31.902 187227 DEBUG oslo_concurrency.lockutils [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Acquired lock "refresh_cache-f333fabf-4a60-49fb-b6dc-d0cbeb847c8f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 17:34:31 compute-0 nova_compute[187223]: 2025-11-28 17:34:31.902 187227 DEBUG nova.network.neutron [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 28 17:34:32 compute-0 nova_compute[187223]: 2025-11-28 17:34:32.089 187227 DEBUG nova.compute.manager [req-db3ec9d8-9c15-485d-96cd-e76cf1686d27 req-9c55dc6e-e028-4c18-bd37-de67e0488a5c 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] Received event network-changed-b2417a6a-f805-4c80-adc3-9a9223d1c16a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:34:32 compute-0 nova_compute[187223]: 2025-11-28 17:34:32.089 187227 DEBUG nova.compute.manager [req-db3ec9d8-9c15-485d-96cd-e76cf1686d27 req-9c55dc6e-e028-4c18-bd37-de67e0488a5c 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] Refreshing instance network info cache due to event network-changed-b2417a6a-f805-4c80-adc3-9a9223d1c16a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 28 17:34:32 compute-0 nova_compute[187223]: 2025-11-28 17:34:32.089 187227 DEBUG oslo_concurrency.lockutils [req-db3ec9d8-9c15-485d-96cd-e76cf1686d27 req-9c55dc6e-e028-4c18-bd37-de67e0488a5c 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "refresh_cache-f333fabf-4a60-49fb-b6dc-d0cbeb847c8f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 17:34:32 compute-0 nova_compute[187223]: 2025-11-28 17:34:32.403 187227 DEBUG nova.network.neutron [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 28 17:34:32 compute-0 nova_compute[187223]: 2025-11-28 17:34:32.704 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:34:33 compute-0 nova_compute[187223]: 2025-11-28 17:34:33.130 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:34:33 compute-0 nova_compute[187223]: 2025-11-28 17:34:33.424 187227 DEBUG nova.network.neutron [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] Updating instance_info_cache with network_info: [{"id": "b2417a6a-f805-4c80-adc3-9a9223d1c16a", "address": "fa:16:3e:28:45:e5", "network": {"id": "2ab0e112-4ca7-4d63-9a9b-4898471ce300", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-519856924-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d63484fb636c435b8307abd484cb8aa7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2417a6a-f8", "ovs_interfaceid": "b2417a6a-f805-4c80-adc3-9a9223d1c16a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:34:33 compute-0 nova_compute[187223]: 2025-11-28 17:34:33.457 187227 DEBUG oslo_concurrency.lockutils [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Releasing lock "refresh_cache-f333fabf-4a60-49fb-b6dc-d0cbeb847c8f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 17:34:33 compute-0 nova_compute[187223]: 2025-11-28 17:34:33.458 187227 DEBUG nova.compute.manager [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] Instance network_info: |[{"id": "b2417a6a-f805-4c80-adc3-9a9223d1c16a", "address": "fa:16:3e:28:45:e5", "network": {"id": "2ab0e112-4ca7-4d63-9a9b-4898471ce300", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-519856924-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d63484fb636c435b8307abd484cb8aa7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2417a6a-f8", "ovs_interfaceid": "b2417a6a-f805-4c80-adc3-9a9223d1c16a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 28 17:34:33 compute-0 nova_compute[187223]: 2025-11-28 17:34:33.458 187227 DEBUG oslo_concurrency.lockutils [req-db3ec9d8-9c15-485d-96cd-e76cf1686d27 req-9c55dc6e-e028-4c18-bd37-de67e0488a5c 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquired lock "refresh_cache-f333fabf-4a60-49fb-b6dc-d0cbeb847c8f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 17:34:33 compute-0 nova_compute[187223]: 2025-11-28 17:34:33.459 187227 DEBUG nova.network.neutron [req-db3ec9d8-9c15-485d-96cd-e76cf1686d27 req-9c55dc6e-e028-4c18-bd37-de67e0488a5c 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] Refreshing network info cache for port b2417a6a-f805-4c80-adc3-9a9223d1c16a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 28 17:34:33 compute-0 nova_compute[187223]: 2025-11-28 17:34:33.461 187227 DEBUG nova.virt.libvirt.driver [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] Start _get_guest_xml network_info=[{"id": "b2417a6a-f805-4c80-adc3-9a9223d1c16a", "address": "fa:16:3e:28:45:e5", "network": {"id": "2ab0e112-4ca7-4d63-9a9b-4898471ce300", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-519856924-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d63484fb636c435b8307abd484cb8aa7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2417a6a-f8", "ovs_interfaceid": "b2417a6a-f805-4c80-adc3-9a9223d1c16a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T17:27:07Z,direct_url=<?>,disk_format='qcow2',id=e66bcfff-a835-4b6a-9892-490d158c356a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ca3b936304c947f98c618f4bf25dee0e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-28T17:27:08Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_format': None, 'device_name': '/dev/vda', 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e66bcfff-a835-4b6a-9892-490d158c356a'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 28 17:34:33 compute-0 nova_compute[187223]: 2025-11-28 17:34:33.466 187227 WARNING nova.virt.libvirt.driver [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 17:34:33 compute-0 nova_compute[187223]: 2025-11-28 17:34:33.472 187227 DEBUG nova.virt.libvirt.host [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 28 17:34:33 compute-0 nova_compute[187223]: 2025-11-28 17:34:33.473 187227 DEBUG nova.virt.libvirt.host [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 28 17:34:33 compute-0 nova_compute[187223]: 2025-11-28 17:34:33.477 187227 DEBUG nova.virt.libvirt.host [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 28 17:34:33 compute-0 nova_compute[187223]: 2025-11-28 17:34:33.478 187227 DEBUG nova.virt.libvirt.host [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 28 17:34:33 compute-0 nova_compute[187223]: 2025-11-28 17:34:33.479 187227 DEBUG nova.virt.libvirt.driver [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 28 17:34:33 compute-0 nova_compute[187223]: 2025-11-28 17:34:33.480 187227 DEBUG nova.virt.hardware [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-28T17:27:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6f44bded-bdbe-4623-9c87-afc5919e8381',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T17:27:07Z,direct_url=<?>,disk_format='qcow2',id=e66bcfff-a835-4b6a-9892-490d158c356a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ca3b936304c947f98c618f4bf25dee0e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-28T17:27:08Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 28 17:34:33 compute-0 nova_compute[187223]: 2025-11-28 17:34:33.480 187227 DEBUG nova.virt.hardware [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 28 17:34:33 compute-0 nova_compute[187223]: 2025-11-28 17:34:33.480 187227 DEBUG nova.virt.hardware [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 28 17:34:33 compute-0 nova_compute[187223]: 2025-11-28 17:34:33.480 187227 DEBUG nova.virt.hardware [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 28 17:34:33 compute-0 nova_compute[187223]: 2025-11-28 17:34:33.481 187227 DEBUG nova.virt.hardware [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 28 17:34:33 compute-0 nova_compute[187223]: 2025-11-28 17:34:33.481 187227 DEBUG nova.virt.hardware [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 28 17:34:33 compute-0 nova_compute[187223]: 2025-11-28 17:34:33.481 187227 DEBUG nova.virt.hardware [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 28 17:34:33 compute-0 nova_compute[187223]: 2025-11-28 17:34:33.481 187227 DEBUG nova.virt.hardware [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 28 17:34:33 compute-0 nova_compute[187223]: 2025-11-28 17:34:33.481 187227 DEBUG nova.virt.hardware [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 28 17:34:33 compute-0 nova_compute[187223]: 2025-11-28 17:34:33.482 187227 DEBUG nova.virt.hardware [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 28 17:34:33 compute-0 nova_compute[187223]: 2025-11-28 17:34:33.482 187227 DEBUG nova.virt.hardware [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 28 17:34:33 compute-0 nova_compute[187223]: 2025-11-28 17:34:33.485 187227 DEBUG nova.virt.libvirt.vif [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T17:34:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-322511206',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-322511206',id=7,image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d63484fb636c435b8307abd484cb8aa7',ramdisk_id='',reservation_id='r-5fc603x4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteBasicStrategy-245891966',owner_user_name='tempest-TestExecuteBasicStrategy-245891966-project-member'},tags=TagLi
st,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T17:34:28Z,user_data=None,user_id='a2fe03a116c1411ebaa81bbd0334f5ed',uuid=f333fabf-4a60-49fb-b6dc-d0cbeb847c8f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b2417a6a-f805-4c80-adc3-9a9223d1c16a", "address": "fa:16:3e:28:45:e5", "network": {"id": "2ab0e112-4ca7-4d63-9a9b-4898471ce300", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-519856924-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d63484fb636c435b8307abd484cb8aa7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2417a6a-f8", "ovs_interfaceid": "b2417a6a-f805-4c80-adc3-9a9223d1c16a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 28 17:34:33 compute-0 nova_compute[187223]: 2025-11-28 17:34:33.485 187227 DEBUG nova.network.os_vif_util [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Converting VIF {"id": "b2417a6a-f805-4c80-adc3-9a9223d1c16a", "address": "fa:16:3e:28:45:e5", "network": {"id": "2ab0e112-4ca7-4d63-9a9b-4898471ce300", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-519856924-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d63484fb636c435b8307abd484cb8aa7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2417a6a-f8", "ovs_interfaceid": "b2417a6a-f805-4c80-adc3-9a9223d1c16a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 17:34:33 compute-0 nova_compute[187223]: 2025-11-28 17:34:33.486 187227 DEBUG nova.network.os_vif_util [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:28:45:e5,bridge_name='br-int',has_traffic_filtering=True,id=b2417a6a-f805-4c80-adc3-9a9223d1c16a,network=Network(2ab0e112-4ca7-4d63-9a9b-4898471ce300),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb2417a6a-f8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 17:34:33 compute-0 nova_compute[187223]: 2025-11-28 17:34:33.487 187227 DEBUG nova.objects.instance [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Lazy-loading 'pci_devices' on Instance uuid f333fabf-4a60-49fb-b6dc-d0cbeb847c8f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:34:33 compute-0 nova_compute[187223]: 2025-11-28 17:34:33.561 187227 DEBUG nova.virt.libvirt.driver [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] End _get_guest_xml xml=<domain type="kvm">
Nov 28 17:34:33 compute-0 nova_compute[187223]:   <uuid>f333fabf-4a60-49fb-b6dc-d0cbeb847c8f</uuid>
Nov 28 17:34:33 compute-0 nova_compute[187223]:   <name>instance-00000007</name>
Nov 28 17:34:33 compute-0 nova_compute[187223]:   <memory>131072</memory>
Nov 28 17:34:33 compute-0 nova_compute[187223]:   <vcpu>1</vcpu>
Nov 28 17:34:33 compute-0 nova_compute[187223]:   <metadata>
Nov 28 17:34:33 compute-0 nova_compute[187223]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 28 17:34:33 compute-0 nova_compute[187223]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 28 17:34:33 compute-0 nova_compute[187223]:       <nova:name>tempest-TestExecuteBasicStrategy-server-322511206</nova:name>
Nov 28 17:34:33 compute-0 nova_compute[187223]:       <nova:creationTime>2025-11-28 17:34:33</nova:creationTime>
Nov 28 17:34:33 compute-0 nova_compute[187223]:       <nova:flavor name="m1.nano">
Nov 28 17:34:33 compute-0 nova_compute[187223]:         <nova:memory>128</nova:memory>
Nov 28 17:34:33 compute-0 nova_compute[187223]:         <nova:disk>1</nova:disk>
Nov 28 17:34:33 compute-0 nova_compute[187223]:         <nova:swap>0</nova:swap>
Nov 28 17:34:33 compute-0 nova_compute[187223]:         <nova:ephemeral>0</nova:ephemeral>
Nov 28 17:34:33 compute-0 nova_compute[187223]:         <nova:vcpus>1</nova:vcpus>
Nov 28 17:34:33 compute-0 nova_compute[187223]:       </nova:flavor>
Nov 28 17:34:33 compute-0 nova_compute[187223]:       <nova:owner>
Nov 28 17:34:33 compute-0 nova_compute[187223]:         <nova:user uuid="a2fe03a116c1411ebaa81bbd0334f5ed">tempest-TestExecuteBasicStrategy-245891966-project-member</nova:user>
Nov 28 17:34:33 compute-0 nova_compute[187223]:         <nova:project uuid="d63484fb636c435b8307abd484cb8aa7">tempest-TestExecuteBasicStrategy-245891966</nova:project>
Nov 28 17:34:33 compute-0 nova_compute[187223]:       </nova:owner>
Nov 28 17:34:33 compute-0 nova_compute[187223]:       <nova:root type="image" uuid="e66bcfff-a835-4b6a-9892-490d158c356a"/>
Nov 28 17:34:33 compute-0 nova_compute[187223]:       <nova:ports>
Nov 28 17:34:33 compute-0 nova_compute[187223]:         <nova:port uuid="b2417a6a-f805-4c80-adc3-9a9223d1c16a">
Nov 28 17:34:33 compute-0 nova_compute[187223]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 28 17:34:33 compute-0 nova_compute[187223]:         </nova:port>
Nov 28 17:34:33 compute-0 nova_compute[187223]:       </nova:ports>
Nov 28 17:34:33 compute-0 nova_compute[187223]:     </nova:instance>
Nov 28 17:34:33 compute-0 nova_compute[187223]:   </metadata>
Nov 28 17:34:33 compute-0 nova_compute[187223]:   <sysinfo type="smbios">
Nov 28 17:34:33 compute-0 nova_compute[187223]:     <system>
Nov 28 17:34:33 compute-0 nova_compute[187223]:       <entry name="manufacturer">RDO</entry>
Nov 28 17:34:33 compute-0 nova_compute[187223]:       <entry name="product">OpenStack Compute</entry>
Nov 28 17:34:33 compute-0 nova_compute[187223]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 28 17:34:33 compute-0 nova_compute[187223]:       <entry name="serial">f333fabf-4a60-49fb-b6dc-d0cbeb847c8f</entry>
Nov 28 17:34:33 compute-0 nova_compute[187223]:       <entry name="uuid">f333fabf-4a60-49fb-b6dc-d0cbeb847c8f</entry>
Nov 28 17:34:33 compute-0 nova_compute[187223]:       <entry name="family">Virtual Machine</entry>
Nov 28 17:34:33 compute-0 nova_compute[187223]:     </system>
Nov 28 17:34:33 compute-0 nova_compute[187223]:   </sysinfo>
Nov 28 17:34:33 compute-0 nova_compute[187223]:   <os>
Nov 28 17:34:33 compute-0 nova_compute[187223]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 28 17:34:33 compute-0 nova_compute[187223]:     <boot dev="hd"/>
Nov 28 17:34:33 compute-0 nova_compute[187223]:     <smbios mode="sysinfo"/>
Nov 28 17:34:33 compute-0 nova_compute[187223]:   </os>
Nov 28 17:34:33 compute-0 nova_compute[187223]:   <features>
Nov 28 17:34:33 compute-0 nova_compute[187223]:     <acpi/>
Nov 28 17:34:33 compute-0 nova_compute[187223]:     <apic/>
Nov 28 17:34:33 compute-0 nova_compute[187223]:     <vmcoreinfo/>
Nov 28 17:34:33 compute-0 nova_compute[187223]:   </features>
Nov 28 17:34:33 compute-0 nova_compute[187223]:   <clock offset="utc">
Nov 28 17:34:33 compute-0 nova_compute[187223]:     <timer name="pit" tickpolicy="delay"/>
Nov 28 17:34:33 compute-0 nova_compute[187223]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 28 17:34:33 compute-0 nova_compute[187223]:     <timer name="hpet" present="no"/>
Nov 28 17:34:33 compute-0 nova_compute[187223]:   </clock>
Nov 28 17:34:33 compute-0 nova_compute[187223]:   <cpu mode="custom" match="exact">
Nov 28 17:34:33 compute-0 nova_compute[187223]:     <model>Nehalem</model>
Nov 28 17:34:33 compute-0 nova_compute[187223]:     <topology sockets="1" cores="1" threads="1"/>
Nov 28 17:34:33 compute-0 nova_compute[187223]:   </cpu>
Nov 28 17:34:33 compute-0 nova_compute[187223]:   <devices>
Nov 28 17:34:33 compute-0 nova_compute[187223]:     <disk type="file" device="disk">
Nov 28 17:34:33 compute-0 nova_compute[187223]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 28 17:34:33 compute-0 nova_compute[187223]:       <source file="/var/lib/nova/instances/f333fabf-4a60-49fb-b6dc-d0cbeb847c8f/disk"/>
Nov 28 17:34:33 compute-0 nova_compute[187223]:       <target dev="vda" bus="virtio"/>
Nov 28 17:34:33 compute-0 nova_compute[187223]:     </disk>
Nov 28 17:34:33 compute-0 nova_compute[187223]:     <disk type="file" device="cdrom">
Nov 28 17:34:33 compute-0 nova_compute[187223]:       <driver name="qemu" type="raw" cache="none"/>
Nov 28 17:34:33 compute-0 nova_compute[187223]:       <source file="/var/lib/nova/instances/f333fabf-4a60-49fb-b6dc-d0cbeb847c8f/disk.config"/>
Nov 28 17:34:33 compute-0 nova_compute[187223]:       <target dev="sda" bus="sata"/>
Nov 28 17:34:33 compute-0 nova_compute[187223]:     </disk>
Nov 28 17:34:33 compute-0 nova_compute[187223]:     <interface type="ethernet">
Nov 28 17:34:33 compute-0 nova_compute[187223]:       <mac address="fa:16:3e:28:45:e5"/>
Nov 28 17:34:33 compute-0 nova_compute[187223]:       <model type="virtio"/>
Nov 28 17:34:33 compute-0 nova_compute[187223]:       <driver name="vhost" rx_queue_size="512"/>
Nov 28 17:34:33 compute-0 nova_compute[187223]:       <mtu size="1442"/>
Nov 28 17:34:33 compute-0 nova_compute[187223]:       <target dev="tapb2417a6a-f8"/>
Nov 28 17:34:33 compute-0 nova_compute[187223]:     </interface>
Nov 28 17:34:33 compute-0 nova_compute[187223]:     <serial type="pty">
Nov 28 17:34:33 compute-0 nova_compute[187223]:       <log file="/var/lib/nova/instances/f333fabf-4a60-49fb-b6dc-d0cbeb847c8f/console.log" append="off"/>
Nov 28 17:34:33 compute-0 nova_compute[187223]:     </serial>
Nov 28 17:34:33 compute-0 nova_compute[187223]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 28 17:34:33 compute-0 nova_compute[187223]:     <video>
Nov 28 17:34:33 compute-0 nova_compute[187223]:       <model type="virtio"/>
Nov 28 17:34:33 compute-0 nova_compute[187223]:     </video>
Nov 28 17:34:33 compute-0 nova_compute[187223]:     <input type="tablet" bus="usb"/>
Nov 28 17:34:33 compute-0 nova_compute[187223]:     <rng model="virtio">
Nov 28 17:34:33 compute-0 nova_compute[187223]:       <backend model="random">/dev/urandom</backend>
Nov 28 17:34:33 compute-0 nova_compute[187223]:     </rng>
Nov 28 17:34:33 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root"/>
Nov 28 17:34:33 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:34:33 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:34:33 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:34:33 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:34:33 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:34:33 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:34:33 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:34:33 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:34:33 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:34:33 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:34:33 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:34:33 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:34:33 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:34:33 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:34:33 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:34:33 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:34:33 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:34:33 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:34:33 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:34:33 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:34:33 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:34:33 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:34:33 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:34:33 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:34:33 compute-0 nova_compute[187223]:     <controller type="usb" index="0"/>
Nov 28 17:34:33 compute-0 nova_compute[187223]:     <memballoon model="virtio">
Nov 28 17:34:33 compute-0 nova_compute[187223]:       <stats period="10"/>
Nov 28 17:34:33 compute-0 nova_compute[187223]:     </memballoon>
Nov 28 17:34:33 compute-0 nova_compute[187223]:   </devices>
Nov 28 17:34:33 compute-0 nova_compute[187223]: </domain>
Nov 28 17:34:33 compute-0 nova_compute[187223]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 28 17:34:33 compute-0 nova_compute[187223]: 2025-11-28 17:34:33.563 187227 DEBUG nova.compute.manager [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] Preparing to wait for external event network-vif-plugged-b2417a6a-f805-4c80-adc3-9a9223d1c16a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 28 17:34:33 compute-0 nova_compute[187223]: 2025-11-28 17:34:33.564 187227 DEBUG oslo_concurrency.lockutils [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Acquiring lock "f333fabf-4a60-49fb-b6dc-d0cbeb847c8f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:34:33 compute-0 nova_compute[187223]: 2025-11-28 17:34:33.564 187227 DEBUG oslo_concurrency.lockutils [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Lock "f333fabf-4a60-49fb-b6dc-d0cbeb847c8f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:34:33 compute-0 nova_compute[187223]: 2025-11-28 17:34:33.564 187227 DEBUG oslo_concurrency.lockutils [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Lock "f333fabf-4a60-49fb-b6dc-d0cbeb847c8f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:34:33 compute-0 nova_compute[187223]: 2025-11-28 17:34:33.565 187227 DEBUG nova.virt.libvirt.vif [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T17:34:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-322511206',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-322511206',id=7,image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d63484fb636c435b8307abd484cb8aa7',ramdisk_id='',reservation_id='r-5fc603x4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteBasicStrategy-245891966',owner_user_name='tempest-TestExecuteBasicStrategy-245891966-project-member'},
tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T17:34:28Z,user_data=None,user_id='a2fe03a116c1411ebaa81bbd0334f5ed',uuid=f333fabf-4a60-49fb-b6dc-d0cbeb847c8f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b2417a6a-f805-4c80-adc3-9a9223d1c16a", "address": "fa:16:3e:28:45:e5", "network": {"id": "2ab0e112-4ca7-4d63-9a9b-4898471ce300", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-519856924-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d63484fb636c435b8307abd484cb8aa7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2417a6a-f8", "ovs_interfaceid": "b2417a6a-f805-4c80-adc3-9a9223d1c16a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 28 17:34:33 compute-0 nova_compute[187223]: 2025-11-28 17:34:33.565 187227 DEBUG nova.network.os_vif_util [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Converting VIF {"id": "b2417a6a-f805-4c80-adc3-9a9223d1c16a", "address": "fa:16:3e:28:45:e5", "network": {"id": "2ab0e112-4ca7-4d63-9a9b-4898471ce300", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-519856924-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d63484fb636c435b8307abd484cb8aa7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2417a6a-f8", "ovs_interfaceid": "b2417a6a-f805-4c80-adc3-9a9223d1c16a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 17:34:33 compute-0 nova_compute[187223]: 2025-11-28 17:34:33.566 187227 DEBUG nova.network.os_vif_util [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:28:45:e5,bridge_name='br-int',has_traffic_filtering=True,id=b2417a6a-f805-4c80-adc3-9a9223d1c16a,network=Network(2ab0e112-4ca7-4d63-9a9b-4898471ce300),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb2417a6a-f8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 17:34:33 compute-0 nova_compute[187223]: 2025-11-28 17:34:33.566 187227 DEBUG os_vif [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:28:45:e5,bridge_name='br-int',has_traffic_filtering=True,id=b2417a6a-f805-4c80-adc3-9a9223d1c16a,network=Network(2ab0e112-4ca7-4d63-9a9b-4898471ce300),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb2417a6a-f8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 28 17:34:33 compute-0 nova_compute[187223]: 2025-11-28 17:34:33.566 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:34:33 compute-0 nova_compute[187223]: 2025-11-28 17:34:33.567 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:34:33 compute-0 nova_compute[187223]: 2025-11-28 17:34:33.567 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 17:34:33 compute-0 nova_compute[187223]: 2025-11-28 17:34:33.572 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:34:33 compute-0 nova_compute[187223]: 2025-11-28 17:34:33.572 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb2417a6a-f8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:34:33 compute-0 nova_compute[187223]: 2025-11-28 17:34:33.573 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb2417a6a-f8, col_values=(('external_ids', {'iface-id': 'b2417a6a-f805-4c80-adc3-9a9223d1c16a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:28:45:e5', 'vm-uuid': 'f333fabf-4a60-49fb-b6dc-d0cbeb847c8f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:34:33 compute-0 NetworkManager[55763]: <info>  [1764351273.6054] manager: (tapb2417a6a-f8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/30)
Nov 28 17:34:33 compute-0 nova_compute[187223]: 2025-11-28 17:34:33.603 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:34:33 compute-0 nova_compute[187223]: 2025-11-28 17:34:33.607 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 17:34:33 compute-0 nova_compute[187223]: 2025-11-28 17:34:33.618 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:34:33 compute-0 nova_compute[187223]: 2025-11-28 17:34:33.619 187227 INFO os_vif [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:28:45:e5,bridge_name='br-int',has_traffic_filtering=True,id=b2417a6a-f805-4c80-adc3-9a9223d1c16a,network=Network(2ab0e112-4ca7-4d63-9a9b-4898471ce300),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb2417a6a-f8')
Nov 28 17:34:33 compute-0 nova_compute[187223]: 2025-11-28 17:34:33.668 187227 DEBUG nova.virt.libvirt.driver [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 28 17:34:33 compute-0 nova_compute[187223]: 2025-11-28 17:34:33.668 187227 DEBUG nova.virt.libvirt.driver [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 28 17:34:33 compute-0 nova_compute[187223]: 2025-11-28 17:34:33.669 187227 DEBUG nova.virt.libvirt.driver [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] No VIF found with MAC fa:16:3e:28:45:e5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 28 17:34:33 compute-0 nova_compute[187223]: 2025-11-28 17:34:33.669 187227 INFO nova.virt.libvirt.driver [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] Using config drive
Nov 28 17:34:34 compute-0 nova_compute[187223]: 2025-11-28 17:34:34.115 187227 INFO nova.virt.libvirt.driver [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] Creating config drive at /var/lib/nova/instances/f333fabf-4a60-49fb-b6dc-d0cbeb847c8f/disk.config
Nov 28 17:34:34 compute-0 nova_compute[187223]: 2025-11-28 17:34:34.123 187227 DEBUG oslo_concurrency.processutils [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f333fabf-4a60-49fb-b6dc-d0cbeb847c8f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpp8djuvyl execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:34:34 compute-0 nova_compute[187223]: 2025-11-28 17:34:34.252 187227 DEBUG oslo_concurrency.processutils [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f333fabf-4a60-49fb-b6dc-d0cbeb847c8f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpp8djuvyl" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:34:34 compute-0 kernel: tapb2417a6a-f8: entered promiscuous mode
Nov 28 17:34:34 compute-0 NetworkManager[55763]: <info>  [1764351274.3278] manager: (tapb2417a6a-f8): new Tun device (/org/freedesktop/NetworkManager/Devices/31)
Nov 28 17:34:34 compute-0 ovn_controller[95574]: 2025-11-28T17:34:34Z|00059|binding|INFO|Claiming lport b2417a6a-f805-4c80-adc3-9a9223d1c16a for this chassis.
Nov 28 17:34:34 compute-0 ovn_controller[95574]: 2025-11-28T17:34:34Z|00060|binding|INFO|b2417a6a-f805-4c80-adc3-9a9223d1c16a: Claiming fa:16:3e:28:45:e5 10.100.0.12
Nov 28 17:34:34 compute-0 nova_compute[187223]: 2025-11-28 17:34:34.329 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:34:34 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:34:34.352 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:28:45:e5 10.100.0.12'], port_security=['fa:16:3e:28:45:e5 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'f333fabf-4a60-49fb-b6dc-d0cbeb847c8f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2ab0e112-4ca7-4d63-9a9b-4898471ce300', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd63484fb636c435b8307abd484cb8aa7', 'neutron:revision_number': '2', 'neutron:security_group_ids': '157785b8-acb3-45e0-be55-3b141f81f23f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8bfaf540-d584-4e5a-842f-49ecdc70c0d8, chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], logical_port=b2417a6a-f805-4c80-adc3-9a9223d1c16a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 17:34:34 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:34:34.353 104433 INFO neutron.agent.ovn.metadata.agent [-] Port b2417a6a-f805-4c80-adc3-9a9223d1c16a in datapath 2ab0e112-4ca7-4d63-9a9b-4898471ce300 bound to our chassis
Nov 28 17:34:34 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:34:34.355 104433 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2ab0e112-4ca7-4d63-9a9b-4898471ce300
Nov 28 17:34:34 compute-0 systemd-udevd[210269]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 17:34:34 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:34:34.369 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[190afdad-911b-4caa-8064-c48ef63b6c70]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:34:34 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:34:34.371 104433 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap2ab0e112-41 in ovnmeta-2ab0e112-4ca7-4d63-9a9b-4898471ce300 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 28 17:34:34 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:34:34.374 208826 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap2ab0e112-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 28 17:34:34 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:34:34.374 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[bac3c8e1-8c82-4e81-a7b5-d5c70b8fce7c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:34:34 compute-0 NetworkManager[55763]: <info>  [1764351274.3754] device (tapb2417a6a-f8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 28 17:34:34 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:34:34.375 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[4b66debc-b84f-4fe5-8cf0-a439dcfacc59]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:34:34 compute-0 NetworkManager[55763]: <info>  [1764351274.3769] device (tapb2417a6a-f8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 28 17:34:34 compute-0 systemd-machined[153517]: New machine qemu-5-instance-00000007.
Nov 28 17:34:34 compute-0 nova_compute[187223]: 2025-11-28 17:34:34.398 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:34:34 compute-0 ovn_controller[95574]: 2025-11-28T17:34:34Z|00061|binding|INFO|Setting lport b2417a6a-f805-4c80-adc3-9a9223d1c16a ovn-installed in OVS
Nov 28 17:34:34 compute-0 ovn_controller[95574]: 2025-11-28T17:34:34Z|00062|binding|INFO|Setting lport b2417a6a-f805-4c80-adc3-9a9223d1c16a up in Southbound
Nov 28 17:34:34 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:34:34.397 104546 DEBUG oslo.privsep.daemon [-] privsep: reply[290830df-3fcb-4fd0-b807-d47d465f5b6a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:34:34 compute-0 nova_compute[187223]: 2025-11-28 17:34:34.400 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:34:34 compute-0 systemd[1]: Started Virtual Machine qemu-5-instance-00000007.
Nov 28 17:34:34 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:34:34.424 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[6f1c921e-b012-4add-be25-c71b2cdb3541]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:34:34 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:34:34.458 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[f3464180-dee0-4a6d-97ad-3ad1b8c89f50]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:34:34 compute-0 NetworkManager[55763]: <info>  [1764351274.4674] manager: (tap2ab0e112-40): new Veth device (/org/freedesktop/NetworkManager/Devices/32)
Nov 28 17:34:34 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:34:34.466 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[2b71b435-e8a2-403d-a2fa-141f11cf61c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:34:34 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:34:34.493 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[a7b48b91-5744-41a4-9c2e-5443bc844430]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:34:34 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:34:34.497 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[69a59381-49d2-46cc-a274-cc3984ee5ae0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:34:34 compute-0 NetworkManager[55763]: <info>  [1764351274.5249] device (tap2ab0e112-40): carrier: link connected
Nov 28 17:34:34 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:34:34.531 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[d630d006-aaa0-4bc7-8524-0734fb7470f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:34:34 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:34:34.551 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[17079368-33b4-4864-9e57-f5cd10fda541]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2ab0e112-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4f:2a:da'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 21], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 444915, 'reachable_time': 38683, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210305, 'error': None, 'target': 'ovnmeta-2ab0e112-4ca7-4d63-9a9b-4898471ce300', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:34:34 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:34:34.569 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[8d278f5e-ff61-4afc-b616-461db7e0dad1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe4f:2ada'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 444915, 'tstamp': 444915}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210306, 'error': None, 'target': 'ovnmeta-2ab0e112-4ca7-4d63-9a9b-4898471ce300', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:34:34 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:34:34.588 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[310b2825-8496-4146-9ff7-f895b180a5b0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2ab0e112-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4f:2a:da'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 21], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 444915, 'reachable_time': 38683, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 210307, 'error': None, 'target': 'ovnmeta-2ab0e112-4ca7-4d63-9a9b-4898471ce300', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:34:34 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:34:34.623 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[131d0740-211e-4307-88ac-02f9adddebc9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:34:34 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:34:34.699 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[2291322c-731f-4baf-9366-36570b38279c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:34:34 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:34:34.701 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2ab0e112-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:34:34 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:34:34.701 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 17:34:34 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:34:34.701 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2ab0e112-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:34:34 compute-0 NetworkManager[55763]: <info>  [1764351274.7046] manager: (tap2ab0e112-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/33)
Nov 28 17:34:34 compute-0 kernel: tap2ab0e112-40: entered promiscuous mode
Nov 28 17:34:34 compute-0 nova_compute[187223]: 2025-11-28 17:34:34.705 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:34:34 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:34:34.717 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2ab0e112-40, col_values=(('external_ids', {'iface-id': 'b81f3cf4-9db1-4ab7-9c60-c031a201b3b5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:34:34 compute-0 ovn_controller[95574]: 2025-11-28T17:34:34Z|00063|binding|INFO|Releasing lport b81f3cf4-9db1-4ab7-9c60-c031a201b3b5 from this chassis (sb_readonly=0)
Nov 28 17:34:34 compute-0 nova_compute[187223]: 2025-11-28 17:34:34.719 187227 DEBUG nova.compute.manager [req-dae3b126-7624-47f7-a69e-ed7659cea3d5 req-602f8786-55d1-48df-b8b4-2927e4e2f0b0 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] Received event network-vif-plugged-b2417a6a-f805-4c80-adc3-9a9223d1c16a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:34:34 compute-0 nova_compute[187223]: 2025-11-28 17:34:34.719 187227 DEBUG oslo_concurrency.lockutils [req-dae3b126-7624-47f7-a69e-ed7659cea3d5 req-602f8786-55d1-48df-b8b4-2927e4e2f0b0 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "f333fabf-4a60-49fb-b6dc-d0cbeb847c8f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:34:34 compute-0 nova_compute[187223]: 2025-11-28 17:34:34.720 187227 DEBUG oslo_concurrency.lockutils [req-dae3b126-7624-47f7-a69e-ed7659cea3d5 req-602f8786-55d1-48df-b8b4-2927e4e2f0b0 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "f333fabf-4a60-49fb-b6dc-d0cbeb847c8f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:34:34 compute-0 nova_compute[187223]: 2025-11-28 17:34:34.720 187227 DEBUG oslo_concurrency.lockutils [req-dae3b126-7624-47f7-a69e-ed7659cea3d5 req-602f8786-55d1-48df-b8b4-2927e4e2f0b0 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "f333fabf-4a60-49fb-b6dc-d0cbeb847c8f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:34:34 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:34:34.720 104433 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2ab0e112-4ca7-4d63-9a9b-4898471ce300.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2ab0e112-4ca7-4d63-9a9b-4898471ce300.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 28 17:34:34 compute-0 nova_compute[187223]: 2025-11-28 17:34:34.720 187227 DEBUG nova.compute.manager [req-dae3b126-7624-47f7-a69e-ed7659cea3d5 req-602f8786-55d1-48df-b8b4-2927e4e2f0b0 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] Processing event network-vif-plugged-b2417a6a-f805-4c80-adc3-9a9223d1c16a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 28 17:34:34 compute-0 nova_compute[187223]: 2025-11-28 17:34:34.721 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:34:34 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:34:34.721 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[b7713a4e-da50-43a2-905c-bdf4faab624d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:34:34 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:34:34.722 104433 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 28 17:34:34 compute-0 ovn_metadata_agent[104428]: global
Nov 28 17:34:34 compute-0 ovn_metadata_agent[104428]:     log         /dev/log local0 debug
Nov 28 17:34:34 compute-0 ovn_metadata_agent[104428]:     log-tag     haproxy-metadata-proxy-2ab0e112-4ca7-4d63-9a9b-4898471ce300
Nov 28 17:34:34 compute-0 ovn_metadata_agent[104428]:     user        root
Nov 28 17:34:34 compute-0 ovn_metadata_agent[104428]:     group       root
Nov 28 17:34:34 compute-0 ovn_metadata_agent[104428]:     maxconn     1024
Nov 28 17:34:34 compute-0 ovn_metadata_agent[104428]:     pidfile     /var/lib/neutron/external/pids/2ab0e112-4ca7-4d63-9a9b-4898471ce300.pid.haproxy
Nov 28 17:34:34 compute-0 ovn_metadata_agent[104428]:     daemon
Nov 28 17:34:34 compute-0 ovn_metadata_agent[104428]: 
Nov 28 17:34:34 compute-0 ovn_metadata_agent[104428]: defaults
Nov 28 17:34:34 compute-0 ovn_metadata_agent[104428]:     log global
Nov 28 17:34:34 compute-0 ovn_metadata_agent[104428]:     mode http
Nov 28 17:34:34 compute-0 ovn_metadata_agent[104428]:     option httplog
Nov 28 17:34:34 compute-0 ovn_metadata_agent[104428]:     option dontlognull
Nov 28 17:34:34 compute-0 ovn_metadata_agent[104428]:     option http-server-close
Nov 28 17:34:34 compute-0 ovn_metadata_agent[104428]:     option forwardfor
Nov 28 17:34:34 compute-0 ovn_metadata_agent[104428]:     retries                 3
Nov 28 17:34:34 compute-0 ovn_metadata_agent[104428]:     timeout http-request    30s
Nov 28 17:34:34 compute-0 ovn_metadata_agent[104428]:     timeout connect         30s
Nov 28 17:34:34 compute-0 ovn_metadata_agent[104428]:     timeout client          32s
Nov 28 17:34:34 compute-0 ovn_metadata_agent[104428]:     timeout server          32s
Nov 28 17:34:34 compute-0 ovn_metadata_agent[104428]:     timeout http-keep-alive 30s
Nov 28 17:34:34 compute-0 ovn_metadata_agent[104428]: 
Nov 28 17:34:34 compute-0 ovn_metadata_agent[104428]: 
Nov 28 17:34:34 compute-0 ovn_metadata_agent[104428]: listen listener
Nov 28 17:34:34 compute-0 ovn_metadata_agent[104428]:     bind 169.254.169.254:80
Nov 28 17:34:34 compute-0 ovn_metadata_agent[104428]:     server metadata /var/lib/neutron/metadata_proxy
Nov 28 17:34:34 compute-0 ovn_metadata_agent[104428]:     http-request add-header X-OVN-Network-ID 2ab0e112-4ca7-4d63-9a9b-4898471ce300
Nov 28 17:34:34 compute-0 ovn_metadata_agent[104428]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 28 17:34:34 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:34:34.723 104433 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-2ab0e112-4ca7-4d63-9a9b-4898471ce300', 'env', 'PROCESS_TAG=haproxy-2ab0e112-4ca7-4d63-9a9b-4898471ce300', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/2ab0e112-4ca7-4d63-9a9b-4898471ce300.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 28 17:34:34 compute-0 nova_compute[187223]: 2025-11-28 17:34:34.731 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:34:34 compute-0 nova_compute[187223]: 2025-11-28 17:34:34.760 187227 DEBUG nova.virt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Emitting event <LifecycleEvent: 1764351274.759925, f333fabf-4a60-49fb-b6dc-d0cbeb847c8f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:34:34 compute-0 nova_compute[187223]: 2025-11-28 17:34:34.761 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] VM Started (Lifecycle Event)
Nov 28 17:34:34 compute-0 nova_compute[187223]: 2025-11-28 17:34:34.763 187227 DEBUG nova.compute.manager [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 28 17:34:34 compute-0 nova_compute[187223]: 2025-11-28 17:34:34.769 187227 DEBUG nova.virt.libvirt.driver [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 28 17:34:34 compute-0 nova_compute[187223]: 2025-11-28 17:34:34.774 187227 INFO nova.virt.libvirt.driver [-] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] Instance spawned successfully.
Nov 28 17:34:34 compute-0 nova_compute[187223]: 2025-11-28 17:34:34.775 187227 DEBUG nova.virt.libvirt.driver [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 28 17:34:34 compute-0 nova_compute[187223]: 2025-11-28 17:34:34.796 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:34:34 compute-0 nova_compute[187223]: 2025-11-28 17:34:34.803 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 28 17:34:34 compute-0 nova_compute[187223]: 2025-11-28 17:34:34.829 187227 DEBUG nova.virt.libvirt.driver [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:34:34 compute-0 nova_compute[187223]: 2025-11-28 17:34:34.829 187227 DEBUG nova.virt.libvirt.driver [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:34:34 compute-0 nova_compute[187223]: 2025-11-28 17:34:34.830 187227 DEBUG nova.virt.libvirt.driver [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:34:34 compute-0 nova_compute[187223]: 2025-11-28 17:34:34.830 187227 DEBUG nova.virt.libvirt.driver [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:34:34 compute-0 nova_compute[187223]: 2025-11-28 17:34:34.831 187227 DEBUG nova.virt.libvirt.driver [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:34:34 compute-0 nova_compute[187223]: 2025-11-28 17:34:34.831 187227 DEBUG nova.virt.libvirt.driver [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:34:34 compute-0 nova_compute[187223]: 2025-11-28 17:34:34.837 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 28 17:34:34 compute-0 nova_compute[187223]: 2025-11-28 17:34:34.838 187227 DEBUG nova.virt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Emitting event <LifecycleEvent: 1764351274.7600653, f333fabf-4a60-49fb-b6dc-d0cbeb847c8f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:34:34 compute-0 nova_compute[187223]: 2025-11-28 17:34:34.838 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] VM Paused (Lifecycle Event)
Nov 28 17:34:34 compute-0 nova_compute[187223]: 2025-11-28 17:34:34.875 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:34:34 compute-0 nova_compute[187223]: 2025-11-28 17:34:34.890 187227 DEBUG nova.virt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Emitting event <LifecycleEvent: 1764351274.766278, f333fabf-4a60-49fb-b6dc-d0cbeb847c8f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:34:34 compute-0 nova_compute[187223]: 2025-11-28 17:34:34.890 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] VM Resumed (Lifecycle Event)
Nov 28 17:34:34 compute-0 nova_compute[187223]: 2025-11-28 17:34:34.923 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:34:34 compute-0 nova_compute[187223]: 2025-11-28 17:34:34.928 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 28 17:34:34 compute-0 nova_compute[187223]: 2025-11-28 17:34:34.935 187227 INFO nova.compute.manager [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] Took 6.29 seconds to spawn the instance on the hypervisor.
Nov 28 17:34:34 compute-0 nova_compute[187223]: 2025-11-28 17:34:34.936 187227 DEBUG nova.compute.manager [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:34:34 compute-0 nova_compute[187223]: 2025-11-28 17:34:34.962 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 28 17:34:35 compute-0 nova_compute[187223]: 2025-11-28 17:34:35.037 187227 DEBUG nova.network.neutron [req-db3ec9d8-9c15-485d-96cd-e76cf1686d27 req-9c55dc6e-e028-4c18-bd37-de67e0488a5c 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] Updated VIF entry in instance network info cache for port b2417a6a-f805-4c80-adc3-9a9223d1c16a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 28 17:34:35 compute-0 nova_compute[187223]: 2025-11-28 17:34:35.037 187227 DEBUG nova.network.neutron [req-db3ec9d8-9c15-485d-96cd-e76cf1686d27 req-9c55dc6e-e028-4c18-bd37-de67e0488a5c 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] Updating instance_info_cache with network_info: [{"id": "b2417a6a-f805-4c80-adc3-9a9223d1c16a", "address": "fa:16:3e:28:45:e5", "network": {"id": "2ab0e112-4ca7-4d63-9a9b-4898471ce300", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-519856924-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d63484fb636c435b8307abd484cb8aa7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2417a6a-f8", "ovs_interfaceid": "b2417a6a-f805-4c80-adc3-9a9223d1c16a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:34:35 compute-0 nova_compute[187223]: 2025-11-28 17:34:35.042 187227 INFO nova.compute.manager [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] Took 6.92 seconds to build instance.
Nov 28 17:34:35 compute-0 nova_compute[187223]: 2025-11-28 17:34:35.070 187227 DEBUG oslo_concurrency.lockutils [req-db3ec9d8-9c15-485d-96cd-e76cf1686d27 req-9c55dc6e-e028-4c18-bd37-de67e0488a5c 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Releasing lock "refresh_cache-f333fabf-4a60-49fb-b6dc-d0cbeb847c8f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 17:34:35 compute-0 nova_compute[187223]: 2025-11-28 17:34:35.071 187227 DEBUG oslo_concurrency.lockutils [None req-37439b2d-a2ef-4d58-aff8-ccf44639b5e8 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Lock "f333fabf-4a60-49fb-b6dc-d0cbeb847c8f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.023s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:34:35 compute-0 podman[210346]: 2025-11-28 17:34:35.177333483 +0000 UTC m=+0.061686936 container create b6560664c758b873694d0d67340fc7c4ce75ecc847a9a03edde5f9eedcd32d04 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2ab0e112-4ca7-4d63-9a9b-4898471ce300, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 28 17:34:35 compute-0 systemd[1]: Started libpod-conmon-b6560664c758b873694d0d67340fc7c4ce75ecc847a9a03edde5f9eedcd32d04.scope.
Nov 28 17:34:35 compute-0 podman[210346]: 2025-11-28 17:34:35.145068809 +0000 UTC m=+0.029422272 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 17:34:35 compute-0 systemd[1]: Started libcrun container.
Nov 28 17:34:35 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7b282ad18e40fd7d1802de594f75fd2ec134ce10e99c853b7bd80f865b9ca028/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 17:34:35 compute-0 podman[210346]: 2025-11-28 17:34:35.295578553 +0000 UTC m=+0.179932076 container init b6560664c758b873694d0d67340fc7c4ce75ecc847a9a03edde5f9eedcd32d04 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2ab0e112-4ca7-4d63-9a9b-4898471ce300, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 17:34:35 compute-0 podman[210346]: 2025-11-28 17:34:35.301631478 +0000 UTC m=+0.185984921 container start b6560664c758b873694d0d67340fc7c4ce75ecc847a9a03edde5f9eedcd32d04 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2ab0e112-4ca7-4d63-9a9b-4898471ce300, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 17:34:35 compute-0 neutron-haproxy-ovnmeta-2ab0e112-4ca7-4d63-9a9b-4898471ce300[210362]: [NOTICE]   (210366) : New worker (210368) forked
Nov 28 17:34:35 compute-0 neutron-haproxy-ovnmeta-2ab0e112-4ca7-4d63-9a9b-4898471ce300[210362]: [NOTICE]   (210366) : Loading success.
Nov 28 17:34:36 compute-0 nova_compute[187223]: 2025-11-28 17:34:36.847 187227 DEBUG nova.compute.manager [req-64c5bfd2-9f9a-4102-a81e-63f38c9945b2 req-497cd3cc-018b-4ecd-8d73-71c3828ac10b 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] Received event network-vif-plugged-b2417a6a-f805-4c80-adc3-9a9223d1c16a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:34:36 compute-0 nova_compute[187223]: 2025-11-28 17:34:36.847 187227 DEBUG oslo_concurrency.lockutils [req-64c5bfd2-9f9a-4102-a81e-63f38c9945b2 req-497cd3cc-018b-4ecd-8d73-71c3828ac10b 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "f333fabf-4a60-49fb-b6dc-d0cbeb847c8f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:34:36 compute-0 nova_compute[187223]: 2025-11-28 17:34:36.848 187227 DEBUG oslo_concurrency.lockutils [req-64c5bfd2-9f9a-4102-a81e-63f38c9945b2 req-497cd3cc-018b-4ecd-8d73-71c3828ac10b 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "f333fabf-4a60-49fb-b6dc-d0cbeb847c8f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:34:36 compute-0 nova_compute[187223]: 2025-11-28 17:34:36.848 187227 DEBUG oslo_concurrency.lockutils [req-64c5bfd2-9f9a-4102-a81e-63f38c9945b2 req-497cd3cc-018b-4ecd-8d73-71c3828ac10b 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "f333fabf-4a60-49fb-b6dc-d0cbeb847c8f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:34:36 compute-0 nova_compute[187223]: 2025-11-28 17:34:36.848 187227 DEBUG nova.compute.manager [req-64c5bfd2-9f9a-4102-a81e-63f38c9945b2 req-497cd3cc-018b-4ecd-8d73-71c3828ac10b 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] No waiting events found dispatching network-vif-plugged-b2417a6a-f805-4c80-adc3-9a9223d1c16a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:34:36 compute-0 nova_compute[187223]: 2025-11-28 17:34:36.849 187227 WARNING nova.compute.manager [req-64c5bfd2-9f9a-4102-a81e-63f38c9945b2 req-497cd3cc-018b-4ecd-8d73-71c3828ac10b 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] Received unexpected event network-vif-plugged-b2417a6a-f805-4c80-adc3-9a9223d1c16a for instance with vm_state active and task_state None.
Nov 28 17:34:38 compute-0 nova_compute[187223]: 2025-11-28 17:34:38.163 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:34:38 compute-0 nova_compute[187223]: 2025-11-28 17:34:38.605 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:34:42 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:34:42.869 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3e:3c:af', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '0e:e0:60:f9:5f:25'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 17:34:42 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:34:42.871 104433 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 17:34:42 compute-0 nova_compute[187223]: 2025-11-28 17:34:42.872 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:34:43 compute-0 nova_compute[187223]: 2025-11-28 17:34:43.166 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:34:43 compute-0 nova_compute[187223]: 2025-11-28 17:34:43.607 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:34:45 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:34:45.874 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2ad2dbac-a967-40fb-b69b-7c374c5f8e9d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:34:46 compute-0 podman[210378]: 2025-11-28 17:34:46.209570082 +0000 UTC m=+0.062866129 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 17:34:48 compute-0 nova_compute[187223]: 2025-11-28 17:34:48.169 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:34:48 compute-0 ovn_controller[95574]: 2025-11-28T17:34:48Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:28:45:e5 10.100.0.12
Nov 28 17:34:48 compute-0 ovn_controller[95574]: 2025-11-28T17:34:48Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:28:45:e5 10.100.0.12
Nov 28 17:34:48 compute-0 nova_compute[187223]: 2025-11-28 17:34:48.610 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:34:53 compute-0 nova_compute[187223]: 2025-11-28 17:34:53.172 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:34:53 compute-0 nova_compute[187223]: 2025-11-28 17:34:53.642 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:34:53 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Nov 28 17:34:53 compute-0 podman[210417]: 2025-11-28 17:34:53.96607497 +0000 UTC m=+0.057026751 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 28 17:34:58 compute-0 nova_compute[187223]: 2025-11-28 17:34:58.175 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:34:58 compute-0 podman[210436]: 2025-11-28 17:34:58.199698284 +0000 UTC m=+0.064556139 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 28 17:34:58 compute-0 podman[210437]: 2025-11-28 17:34:58.271236993 +0000 UTC m=+0.120085295 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 17:34:58 compute-0 nova_compute[187223]: 2025-11-28 17:34:58.657 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:34:59 compute-0 podman[197556]: time="2025-11-28T17:34:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:34:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:34:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18323 "" "Go-http-client/1.1"
Nov 28 17:34:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:34:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3051 "" "Go-http-client/1.1"
Nov 28 17:35:01 compute-0 podman[210482]: 2025-11-28 17:35:01.229243425 +0000 UTC m=+0.076318149 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, vcs-type=git, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, maintainer=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, version=9.6, architecture=x86_64, container_name=openstack_network_exporter)
Nov 28 17:35:01 compute-0 openstack_network_exporter[199717]: ERROR   17:35:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:35:01 compute-0 openstack_network_exporter[199717]: ERROR   17:35:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:35:01 compute-0 openstack_network_exporter[199717]: ERROR   17:35:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:35:01 compute-0 openstack_network_exporter[199717]: ERROR   17:35:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:35:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:35:01 compute-0 openstack_network_exporter[199717]: ERROR   17:35:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:35:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:35:03 compute-0 nova_compute[187223]: 2025-11-28 17:35:03.178 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:35:03 compute-0 nova_compute[187223]: 2025-11-28 17:35:03.660 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:35:08 compute-0 nova_compute[187223]: 2025-11-28 17:35:08.181 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:35:08 compute-0 nova_compute[187223]: 2025-11-28 17:35:08.698 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:35:12 compute-0 ovn_controller[95574]: 2025-11-28T17:35:12Z|00064|memory_trim|INFO|Detected inactivity (last active 30012 ms ago): trimming memory
Nov 28 17:35:13 compute-0 nova_compute[187223]: 2025-11-28 17:35:13.185 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:35:13 compute-0 nova_compute[187223]: 2025-11-28 17:35:13.701 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:35:17 compute-0 podman[210504]: 2025-11-28 17:35:17.207824158 +0000 UTC m=+0.065395563 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 17:35:17 compute-0 sshd-session[210521]: Connection closed by 193.32.162.146 port 59668
Nov 28 17:35:18 compute-0 nova_compute[187223]: 2025-11-28 17:35:18.187 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:35:18 compute-0 nova_compute[187223]: 2025-11-28 17:35:18.703 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:35:20 compute-0 nova_compute[187223]: 2025-11-28 17:35:20.793 187227 DEBUG nova.virt.libvirt.driver [None req-1a71c286-5776-4bcb-bbdb-029a50ee1d8a a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: e81c4034-7fd8-453d-9ece-6ce03cb4aa70] Creating tmpfile /var/lib/nova/instances/tmp6li5ac5a to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Nov 28 17:35:20 compute-0 nova_compute[187223]: 2025-11-28 17:35:20.795 187227 DEBUG nova.compute.manager [None req-1a71c286-5776-4bcb-bbdb-029a50ee1d8a a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp6li5ac5a',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Nov 28 17:35:21 compute-0 nova_compute[187223]: 2025-11-28 17:35:21.978 187227 DEBUG nova.compute.manager [None req-1a71c286-5776-4bcb-bbdb-029a50ee1d8a a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp6li5ac5a',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='e81c4034-7fd8-453d-9ece-6ce03cb4aa70',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Nov 28 17:35:22 compute-0 nova_compute[187223]: 2025-11-28 17:35:22.001 187227 DEBUG oslo_concurrency.lockutils [None req-1a71c286-5776-4bcb-bbdb-029a50ee1d8a a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "refresh_cache-e81c4034-7fd8-453d-9ece-6ce03cb4aa70" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 17:35:22 compute-0 nova_compute[187223]: 2025-11-28 17:35:22.002 187227 DEBUG oslo_concurrency.lockutils [None req-1a71c286-5776-4bcb-bbdb-029a50ee1d8a a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquired lock "refresh_cache-e81c4034-7fd8-453d-9ece-6ce03cb4aa70" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 17:35:22 compute-0 nova_compute[187223]: 2025-11-28 17:35:22.002 187227 DEBUG nova.network.neutron [None req-1a71c286-5776-4bcb-bbdb-029a50ee1d8a a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: e81c4034-7fd8-453d-9ece-6ce03cb4aa70] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 28 17:35:23 compute-0 nova_compute[187223]: 2025-11-28 17:35:23.189 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:35:23 compute-0 nova_compute[187223]: 2025-11-28 17:35:23.451 187227 DEBUG nova.network.neutron [None req-1a71c286-5776-4bcb-bbdb-029a50ee1d8a a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: e81c4034-7fd8-453d-9ece-6ce03cb4aa70] Updating instance_info_cache with network_info: [{"id": "ec35ba7f-0da1-4858-91e5-70aaebb424d4", "address": "fa:16:3e:4a:01:0b", "network": {"id": "2ab0e112-4ca7-4d63-9a9b-4898471ce300", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-519856924-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d63484fb636c435b8307abd484cb8aa7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec35ba7f-0d", "ovs_interfaceid": "ec35ba7f-0da1-4858-91e5-70aaebb424d4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:35:23 compute-0 nova_compute[187223]: 2025-11-28 17:35:23.474 187227 DEBUG oslo_concurrency.lockutils [None req-1a71c286-5776-4bcb-bbdb-029a50ee1d8a a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Releasing lock "refresh_cache-e81c4034-7fd8-453d-9ece-6ce03cb4aa70" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 17:35:23 compute-0 nova_compute[187223]: 2025-11-28 17:35:23.477 187227 DEBUG nova.virt.libvirt.driver [None req-1a71c286-5776-4bcb-bbdb-029a50ee1d8a a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: e81c4034-7fd8-453d-9ece-6ce03cb4aa70] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp6li5ac5a',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='e81c4034-7fd8-453d-9ece-6ce03cb4aa70',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Nov 28 17:35:23 compute-0 nova_compute[187223]: 2025-11-28 17:35:23.478 187227 DEBUG nova.virt.libvirt.driver [None req-1a71c286-5776-4bcb-bbdb-029a50ee1d8a a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: e81c4034-7fd8-453d-9ece-6ce03cb4aa70] Creating instance directory: /var/lib/nova/instances/e81c4034-7fd8-453d-9ece-6ce03cb4aa70 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Nov 28 17:35:23 compute-0 nova_compute[187223]: 2025-11-28 17:35:23.479 187227 DEBUG nova.virt.libvirt.driver [None req-1a71c286-5776-4bcb-bbdb-029a50ee1d8a a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: e81c4034-7fd8-453d-9ece-6ce03cb4aa70] Creating disk.info with the contents: {'/var/lib/nova/instances/e81c4034-7fd8-453d-9ece-6ce03cb4aa70/disk': 'qcow2', '/var/lib/nova/instances/e81c4034-7fd8-453d-9ece-6ce03cb4aa70/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Nov 28 17:35:23 compute-0 nova_compute[187223]: 2025-11-28 17:35:23.479 187227 DEBUG nova.virt.libvirt.driver [None req-1a71c286-5776-4bcb-bbdb-029a50ee1d8a a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: e81c4034-7fd8-453d-9ece-6ce03cb4aa70] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Nov 28 17:35:23 compute-0 nova_compute[187223]: 2025-11-28 17:35:23.480 187227 DEBUG nova.objects.instance [None req-1a71c286-5776-4bcb-bbdb-029a50ee1d8a a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lazy-loading 'trusted_certs' on Instance uuid e81c4034-7fd8-453d-9ece-6ce03cb4aa70 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:35:23 compute-0 nova_compute[187223]: 2025-11-28 17:35:23.526 187227 DEBUG oslo_concurrency.processutils [None req-1a71c286-5776-4bcb-bbdb-029a50ee1d8a a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:35:23 compute-0 nova_compute[187223]: 2025-11-28 17:35:23.614 187227 DEBUG oslo_concurrency.processutils [None req-1a71c286-5776-4bcb-bbdb-029a50ee1d8a a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:35:23 compute-0 nova_compute[187223]: 2025-11-28 17:35:23.615 187227 DEBUG oslo_concurrency.lockutils [None req-1a71c286-5776-4bcb-bbdb-029a50ee1d8a a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "bd6565e0cc32420aac21dd3de8842bc53bb13eba" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:35:23 compute-0 nova_compute[187223]: 2025-11-28 17:35:23.616 187227 DEBUG oslo_concurrency.lockutils [None req-1a71c286-5776-4bcb-bbdb-029a50ee1d8a a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "bd6565e0cc32420aac21dd3de8842bc53bb13eba" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:35:23 compute-0 nova_compute[187223]: 2025-11-28 17:35:23.631 187227 DEBUG oslo_concurrency.processutils [None req-1a71c286-5776-4bcb-bbdb-029a50ee1d8a a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:35:23 compute-0 nova_compute[187223]: 2025-11-28 17:35:23.703 187227 DEBUG oslo_concurrency.processutils [None req-1a71c286-5776-4bcb-bbdb-029a50ee1d8a a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:35:23 compute-0 nova_compute[187223]: 2025-11-28 17:35:23.705 187227 DEBUG oslo_concurrency.processutils [None req-1a71c286-5776-4bcb-bbdb-029a50ee1d8a a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba,backing_fmt=raw /var/lib/nova/instances/e81c4034-7fd8-453d-9ece-6ce03cb4aa70/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:35:23 compute-0 nova_compute[187223]: 2025-11-28 17:35:23.725 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:35:23 compute-0 nova_compute[187223]: 2025-11-28 17:35:23.958 187227 DEBUG oslo_concurrency.processutils [None req-1a71c286-5776-4bcb-bbdb-029a50ee1d8a a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba,backing_fmt=raw /var/lib/nova/instances/e81c4034-7fd8-453d-9ece-6ce03cb4aa70/disk 1073741824" returned: 0 in 0.253s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:35:23 compute-0 nova_compute[187223]: 2025-11-28 17:35:23.959 187227 DEBUG oslo_concurrency.lockutils [None req-1a71c286-5776-4bcb-bbdb-029a50ee1d8a a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "bd6565e0cc32420aac21dd3de8842bc53bb13eba" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.344s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:35:23 compute-0 nova_compute[187223]: 2025-11-28 17:35:23.960 187227 DEBUG oslo_concurrency.processutils [None req-1a71c286-5776-4bcb-bbdb-029a50ee1d8a a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:35:24 compute-0 nova_compute[187223]: 2025-11-28 17:35:24.045 187227 DEBUG oslo_concurrency.processutils [None req-1a71c286-5776-4bcb-bbdb-029a50ee1d8a a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:35:24 compute-0 nova_compute[187223]: 2025-11-28 17:35:24.046 187227 DEBUG nova.virt.disk.api [None req-1a71c286-5776-4bcb-bbdb-029a50ee1d8a a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Checking if we can resize image /var/lib/nova/instances/e81c4034-7fd8-453d-9ece-6ce03cb4aa70/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 28 17:35:24 compute-0 nova_compute[187223]: 2025-11-28 17:35:24.046 187227 DEBUG oslo_concurrency.processutils [None req-1a71c286-5776-4bcb-bbdb-029a50ee1d8a a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e81c4034-7fd8-453d-9ece-6ce03cb4aa70/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:35:24 compute-0 nova_compute[187223]: 2025-11-28 17:35:24.103 187227 DEBUG oslo_concurrency.processutils [None req-1a71c286-5776-4bcb-bbdb-029a50ee1d8a a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e81c4034-7fd8-453d-9ece-6ce03cb4aa70/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:35:24 compute-0 nova_compute[187223]: 2025-11-28 17:35:24.105 187227 DEBUG nova.virt.disk.api [None req-1a71c286-5776-4bcb-bbdb-029a50ee1d8a a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Cannot resize image /var/lib/nova/instances/e81c4034-7fd8-453d-9ece-6ce03cb4aa70/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 28 17:35:24 compute-0 nova_compute[187223]: 2025-11-28 17:35:24.105 187227 DEBUG nova.objects.instance [None req-1a71c286-5776-4bcb-bbdb-029a50ee1d8a a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lazy-loading 'migration_context' on Instance uuid e81c4034-7fd8-453d-9ece-6ce03cb4aa70 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:35:24 compute-0 nova_compute[187223]: 2025-11-28 17:35:24.126 187227 DEBUG oslo_concurrency.processutils [None req-1a71c286-5776-4bcb-bbdb-029a50ee1d8a a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/e81c4034-7fd8-453d-9ece-6ce03cb4aa70/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:35:24 compute-0 nova_compute[187223]: 2025-11-28 17:35:24.173 187227 DEBUG oslo_concurrency.processutils [None req-1a71c286-5776-4bcb-bbdb-029a50ee1d8a a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/e81c4034-7fd8-453d-9ece-6ce03cb4aa70/disk.config 485376" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:35:24 compute-0 nova_compute[187223]: 2025-11-28 17:35:24.175 187227 DEBUG nova.virt.libvirt.volume.remotefs [None req-1a71c286-5776-4bcb-bbdb-029a50ee1d8a a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Copying file compute-1.ctlplane.example.com:/var/lib/nova/instances/e81c4034-7fd8-453d-9ece-6ce03cb4aa70/disk.config to /var/lib/nova/instances/e81c4034-7fd8-453d-9ece-6ce03cb4aa70 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Nov 28 17:35:24 compute-0 nova_compute[187223]: 2025-11-28 17:35:24.175 187227 DEBUG oslo_concurrency.processutils [None req-1a71c286-5776-4bcb-bbdb-029a50ee1d8a a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Running cmd (subprocess): scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/e81c4034-7fd8-453d-9ece-6ce03cb4aa70/disk.config /var/lib/nova/instances/e81c4034-7fd8-453d-9ece-6ce03cb4aa70 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:35:24 compute-0 podman[210545]: 2025-11-28 17:35:24.197611115 +0000 UTC m=+0.058879435 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 28 17:35:24 compute-0 nova_compute[187223]: 2025-11-28 17:35:24.772 187227 DEBUG oslo_concurrency.processutils [None req-1a71c286-5776-4bcb-bbdb-029a50ee1d8a a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] CMD "scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/e81c4034-7fd8-453d-9ece-6ce03cb4aa70/disk.config /var/lib/nova/instances/e81c4034-7fd8-453d-9ece-6ce03cb4aa70" returned: 0 in 0.596s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:35:24 compute-0 nova_compute[187223]: 2025-11-28 17:35:24.773 187227 DEBUG nova.virt.libvirt.driver [None req-1a71c286-5776-4bcb-bbdb-029a50ee1d8a a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: e81c4034-7fd8-453d-9ece-6ce03cb4aa70] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Nov 28 17:35:24 compute-0 nova_compute[187223]: 2025-11-28 17:35:24.775 187227 DEBUG nova.virt.libvirt.vif [None req-1a71c286-5776-4bcb-bbdb-029a50ee1d8a a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T17:34:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-1256551644',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-1256551644',id=8,image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-28T17:34:47Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d63484fb636c435b8307abd484cb8aa7',ramdisk_id='',reservation_id='r-3p9uqqsj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteBasicStrategy-245891966',owner_user_name='tempest-TestExecuteBasicStrategy-245891966-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T17:34:47Z,user_data=None,user_id='a2fe03a116c1411ebaa81bbd0334f5ed',uuid=e81c4034-7fd8-453d-9ece-6ce03cb4aa70,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ec35ba7f-0da1-4858-91e5-70aaebb424d4", "address": "fa:16:3e:4a:01:0b", "network": {"id": "2ab0e112-4ca7-4d63-9a9b-4898471ce300", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-519856924-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d63484fb636c435b8307abd484cb8aa7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapec35ba7f-0d", "ovs_interfaceid": "ec35ba7f-0da1-4858-91e5-70aaebb424d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 28 17:35:24 compute-0 nova_compute[187223]: 2025-11-28 17:35:24.776 187227 DEBUG nova.network.os_vif_util [None req-1a71c286-5776-4bcb-bbdb-029a50ee1d8a a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Converting VIF {"id": "ec35ba7f-0da1-4858-91e5-70aaebb424d4", "address": "fa:16:3e:4a:01:0b", "network": {"id": "2ab0e112-4ca7-4d63-9a9b-4898471ce300", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-519856924-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d63484fb636c435b8307abd484cb8aa7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapec35ba7f-0d", "ovs_interfaceid": "ec35ba7f-0da1-4858-91e5-70aaebb424d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 17:35:24 compute-0 nova_compute[187223]: 2025-11-28 17:35:24.777 187227 DEBUG nova.network.os_vif_util [None req-1a71c286-5776-4bcb-bbdb-029a50ee1d8a a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4a:01:0b,bridge_name='br-int',has_traffic_filtering=True,id=ec35ba7f-0da1-4858-91e5-70aaebb424d4,network=Network(2ab0e112-4ca7-4d63-9a9b-4898471ce300),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec35ba7f-0d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 17:35:24 compute-0 nova_compute[187223]: 2025-11-28 17:35:24.778 187227 DEBUG os_vif [None req-1a71c286-5776-4bcb-bbdb-029a50ee1d8a a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4a:01:0b,bridge_name='br-int',has_traffic_filtering=True,id=ec35ba7f-0da1-4858-91e5-70aaebb424d4,network=Network(2ab0e112-4ca7-4d63-9a9b-4898471ce300),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec35ba7f-0d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 28 17:35:24 compute-0 nova_compute[187223]: 2025-11-28 17:35:24.779 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:35:24 compute-0 nova_compute[187223]: 2025-11-28 17:35:24.780 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:35:24 compute-0 nova_compute[187223]: 2025-11-28 17:35:24.781 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 17:35:24 compute-0 nova_compute[187223]: 2025-11-28 17:35:24.783 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:35:24 compute-0 nova_compute[187223]: 2025-11-28 17:35:24.784 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapec35ba7f-0d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:35:24 compute-0 nova_compute[187223]: 2025-11-28 17:35:24.785 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapec35ba7f-0d, col_values=(('external_ids', {'iface-id': 'ec35ba7f-0da1-4858-91e5-70aaebb424d4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4a:01:0b', 'vm-uuid': 'e81c4034-7fd8-453d-9ece-6ce03cb4aa70'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:35:24 compute-0 nova_compute[187223]: 2025-11-28 17:35:24.787 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:35:24 compute-0 NetworkManager[55763]: <info>  [1764351324.7887] manager: (tapec35ba7f-0d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/34)
Nov 28 17:35:24 compute-0 nova_compute[187223]: 2025-11-28 17:35:24.790 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 17:35:24 compute-0 nova_compute[187223]: 2025-11-28 17:35:24.797 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:35:24 compute-0 nova_compute[187223]: 2025-11-28 17:35:24.799 187227 INFO os_vif [None req-1a71c286-5776-4bcb-bbdb-029a50ee1d8a a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4a:01:0b,bridge_name='br-int',has_traffic_filtering=True,id=ec35ba7f-0da1-4858-91e5-70aaebb424d4,network=Network(2ab0e112-4ca7-4d63-9a9b-4898471ce300),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec35ba7f-0d')
Nov 28 17:35:24 compute-0 nova_compute[187223]: 2025-11-28 17:35:24.799 187227 DEBUG nova.virt.libvirt.driver [None req-1a71c286-5776-4bcb-bbdb-029a50ee1d8a a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Nov 28 17:35:24 compute-0 nova_compute[187223]: 2025-11-28 17:35:24.800 187227 DEBUG nova.compute.manager [None req-1a71c286-5776-4bcb-bbdb-029a50ee1d8a a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp6li5ac5a',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='e81c4034-7fd8-453d-9ece-6ce03cb4aa70',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Nov 28 17:35:26 compute-0 nova_compute[187223]: 2025-11-28 17:35:26.521 187227 DEBUG nova.network.neutron [None req-1a71c286-5776-4bcb-bbdb-029a50ee1d8a a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: e81c4034-7fd8-453d-9ece-6ce03cb4aa70] Port ec35ba7f-0da1-4858-91e5-70aaebb424d4 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Nov 28 17:35:26 compute-0 nova_compute[187223]: 2025-11-28 17:35:26.523 187227 DEBUG nova.compute.manager [None req-1a71c286-5776-4bcb-bbdb-029a50ee1d8a a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp6li5ac5a',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='e81c4034-7fd8-453d-9ece-6ce03cb4aa70',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Nov 28 17:35:26 compute-0 systemd[1]: Starting libvirt proxy daemon...
Nov 28 17:35:26 compute-0 nova_compute[187223]: 2025-11-28 17:35:26.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:35:26 compute-0 systemd[1]: Started libvirt proxy daemon.
Nov 28 17:35:26 compute-0 kernel: tapec35ba7f-0d: entered promiscuous mode
Nov 28 17:35:26 compute-0 NetworkManager[55763]: <info>  [1764351326.8721] manager: (tapec35ba7f-0d): new Tun device (/org/freedesktop/NetworkManager/Devices/35)
Nov 28 17:35:26 compute-0 ovn_controller[95574]: 2025-11-28T17:35:26Z|00065|binding|INFO|Claiming lport ec35ba7f-0da1-4858-91e5-70aaebb424d4 for this additional chassis.
Nov 28 17:35:26 compute-0 nova_compute[187223]: 2025-11-28 17:35:26.874 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:35:26 compute-0 ovn_controller[95574]: 2025-11-28T17:35:26Z|00066|binding|INFO|ec35ba7f-0da1-4858-91e5-70aaebb424d4: Claiming fa:16:3e:4a:01:0b 10.100.0.14
Nov 28 17:35:26 compute-0 ovn_controller[95574]: 2025-11-28T17:35:26Z|00067|binding|INFO|Setting lport ec35ba7f-0da1-4858-91e5-70aaebb424d4 ovn-installed in OVS
Nov 28 17:35:26 compute-0 nova_compute[187223]: 2025-11-28 17:35:26.890 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:35:26 compute-0 systemd-machined[153517]: New machine qemu-6-instance-00000008.
Nov 28 17:35:26 compute-0 systemd-udevd[210606]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 17:35:26 compute-0 systemd[1]: Started Virtual Machine qemu-6-instance-00000008.
Nov 28 17:35:26 compute-0 NetworkManager[55763]: <info>  [1764351326.9408] device (tapec35ba7f-0d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 28 17:35:26 compute-0 NetworkManager[55763]: <info>  [1764351326.9415] device (tapec35ba7f-0d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 28 17:35:27 compute-0 nova_compute[187223]: 2025-11-28 17:35:27.436 187227 DEBUG nova.virt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Emitting event <LifecycleEvent: 1764351327.435551, e81c4034-7fd8-453d-9ece-6ce03cb4aa70 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:35:27 compute-0 nova_compute[187223]: 2025-11-28 17:35:27.437 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: e81c4034-7fd8-453d-9ece-6ce03cb4aa70] VM Started (Lifecycle Event)
Nov 28 17:35:27 compute-0 nova_compute[187223]: 2025-11-28 17:35:27.455 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: e81c4034-7fd8-453d-9ece-6ce03cb4aa70] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:35:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:35:27.681 104433 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:35:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:35:27.682 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:35:27 compute-0 nova_compute[187223]: 2025-11-28 17:35:27.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:35:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:35:27.684 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:35:27 compute-0 nova_compute[187223]: 2025-11-28 17:35:27.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:35:27 compute-0 nova_compute[187223]: 2025-11-28 17:35:27.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:35:27 compute-0 nova_compute[187223]: 2025-11-28 17:35:27.716 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:35:27 compute-0 nova_compute[187223]: 2025-11-28 17:35:27.716 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:35:27 compute-0 nova_compute[187223]: 2025-11-28 17:35:27.716 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:35:27 compute-0 nova_compute[187223]: 2025-11-28 17:35:27.717 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 17:35:27 compute-0 nova_compute[187223]: 2025-11-28 17:35:27.791 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f333fabf-4a60-49fb-b6dc-d0cbeb847c8f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:35:27 compute-0 nova_compute[187223]: 2025-11-28 17:35:27.869 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f333fabf-4a60-49fb-b6dc-d0cbeb847c8f/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:35:27 compute-0 nova_compute[187223]: 2025-11-28 17:35:27.870 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f333fabf-4a60-49fb-b6dc-d0cbeb847c8f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:35:27 compute-0 nova_compute[187223]: 2025-11-28 17:35:27.968 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f333fabf-4a60-49fb-b6dc-d0cbeb847c8f/disk --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:35:27 compute-0 nova_compute[187223]: 2025-11-28 17:35:27.973 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e81c4034-7fd8-453d-9ece-6ce03cb4aa70/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:35:28 compute-0 nova_compute[187223]: 2025-11-28 17:35:28.046 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e81c4034-7fd8-453d-9ece-6ce03cb4aa70/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:35:28 compute-0 nova_compute[187223]: 2025-11-28 17:35:28.048 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e81c4034-7fd8-453d-9ece-6ce03cb4aa70/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:35:28 compute-0 nova_compute[187223]: 2025-11-28 17:35:28.121 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e81c4034-7fd8-453d-9ece-6ce03cb4aa70/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:35:28 compute-0 nova_compute[187223]: 2025-11-28 17:35:28.193 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:35:28 compute-0 nova_compute[187223]: 2025-11-28 17:35:28.264 187227 DEBUG nova.virt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Emitting event <LifecycleEvent: 1764351328.2643423, e81c4034-7fd8-453d-9ece-6ce03cb4aa70 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:35:28 compute-0 nova_compute[187223]: 2025-11-28 17:35:28.265 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: e81c4034-7fd8-453d-9ece-6ce03cb4aa70] VM Resumed (Lifecycle Event)
Nov 28 17:35:28 compute-0 nova_compute[187223]: 2025-11-28 17:35:28.288 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: e81c4034-7fd8-453d-9ece-6ce03cb4aa70] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:35:28 compute-0 nova_compute[187223]: 2025-11-28 17:35:28.292 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: e81c4034-7fd8-453d-9ece-6ce03cb4aa70] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 28 17:35:28 compute-0 nova_compute[187223]: 2025-11-28 17:35:28.341 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: e81c4034-7fd8-453d-9ece-6ce03cb4aa70] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-0.ctlplane.example.com
Nov 28 17:35:28 compute-0 nova_compute[187223]: 2025-11-28 17:35:28.354 187227 WARNING nova.virt.libvirt.driver [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 17:35:28 compute-0 nova_compute[187223]: 2025-11-28 17:35:28.355 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5678MB free_disk=73.28740692138672GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 17:35:28 compute-0 nova_compute[187223]: 2025-11-28 17:35:28.355 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:35:28 compute-0 nova_compute[187223]: 2025-11-28 17:35:28.356 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:35:28 compute-0 nova_compute[187223]: 2025-11-28 17:35:28.396 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Migration for instance e81c4034-7fd8-453d-9ece-6ce03cb4aa70 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Nov 28 17:35:28 compute-0 nova_compute[187223]: 2025-11-28 17:35:28.416 187227 INFO nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] [instance: e81c4034-7fd8-453d-9ece-6ce03cb4aa70] Updating resource usage from migration 13fde30d-59aa-445a-803f-049582613c26
Nov 28 17:35:28 compute-0 nova_compute[187223]: 2025-11-28 17:35:28.416 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] [instance: e81c4034-7fd8-453d-9ece-6ce03cb4aa70] Starting to track incoming migration 13fde30d-59aa-445a-803f-049582613c26 with flavor 6f44bded-bdbe-4623-9c87-afc5919e8381 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Nov 28 17:35:28 compute-0 nova_compute[187223]: 2025-11-28 17:35:28.464 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Instance f333fabf-4a60-49fb-b6dc-d0cbeb847c8f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 17:35:28 compute-0 nova_compute[187223]: 2025-11-28 17:35:28.484 187227 WARNING nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Instance e81c4034-7fd8-453d-9ece-6ce03cb4aa70 has been moved to another host compute-1.ctlplane.example.com(compute-1.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Nov 28 17:35:28 compute-0 nova_compute[187223]: 2025-11-28 17:35:28.486 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 17:35:28 compute-0 nova_compute[187223]: 2025-11-28 17:35:28.486 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 17:35:28 compute-0 nova_compute[187223]: 2025-11-28 17:35:28.562 187227 DEBUG nova.compute.provider_tree [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 17:35:28 compute-0 nova_compute[187223]: 2025-11-28 17:35:28.594 187227 DEBUG nova.scheduler.client.report [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 17:35:28 compute-0 nova_compute[187223]: 2025-11-28 17:35:28.616 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 17:35:28 compute-0 nova_compute[187223]: 2025-11-28 17:35:28.617 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.261s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:35:28 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:35:28.954 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3e:3c:af', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '0e:e0:60:f9:5f:25'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 17:35:28 compute-0 nova_compute[187223]: 2025-11-28 17:35:28.955 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:35:28 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:35:28.957 104433 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 17:35:29 compute-0 podman[210646]: 2025-11-28 17:35:29.240341735 +0000 UTC m=+0.095710350 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Nov 28 17:35:29 compute-0 podman[210647]: 2025-11-28 17:35:29.241797327 +0000 UTC m=+0.096046140 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 28 17:35:29 compute-0 ovn_controller[95574]: 2025-11-28T17:35:29Z|00068|binding|INFO|Claiming lport ec35ba7f-0da1-4858-91e5-70aaebb424d4 for this chassis.
Nov 28 17:35:29 compute-0 ovn_controller[95574]: 2025-11-28T17:35:29Z|00069|binding|INFO|ec35ba7f-0da1-4858-91e5-70aaebb424d4: Claiming fa:16:3e:4a:01:0b 10.100.0.14
Nov 28 17:35:29 compute-0 ovn_controller[95574]: 2025-11-28T17:35:29Z|00070|binding|INFO|Setting lport ec35ba7f-0da1-4858-91e5-70aaebb424d4 up in Southbound
Nov 28 17:35:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:35:29.322 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4a:01:0b 10.100.0.14'], port_security=['fa:16:3e:4a:01:0b 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'e81c4034-7fd8-453d-9ece-6ce03cb4aa70', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2ab0e112-4ca7-4d63-9a9b-4898471ce300', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd63484fb636c435b8307abd484cb8aa7', 'neutron:revision_number': '11', 'neutron:security_group_ids': '157785b8-acb3-45e0-be55-3b141f81f23f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8bfaf540-d584-4e5a-842f-49ecdc70c0d8, chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], logical_port=ec35ba7f-0da1-4858-91e5-70aaebb424d4) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 17:35:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:35:29.323 104433 INFO neutron.agent.ovn.metadata.agent [-] Port ec35ba7f-0da1-4858-91e5-70aaebb424d4 in datapath 2ab0e112-4ca7-4d63-9a9b-4898471ce300 bound to our chassis
Nov 28 17:35:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:35:29.324 104433 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2ab0e112-4ca7-4d63-9a9b-4898471ce300
Nov 28 17:35:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:35:29.342 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[4eaaee3d-5a7b-4afb-a2a5-a0a07e4bdda6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:35:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:35:29.374 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[633192f7-0d08-4001-9310-1b8ade0f6091]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:35:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:35:29.378 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[aad188de-8d66-4d77-8a51-20d1da0e9c5c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:35:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:35:29.413 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[85c0f671-407c-46d1-9692-5e6b38dbeeba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:35:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:35:29.436 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[00873a98-e0a7-403f-8deb-465f1bc187fa]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2ab0e112-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4f:2a:da'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 21], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 444915, 'reachable_time': 44203, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210700, 'error': None, 'target': 'ovnmeta-2ab0e112-4ca7-4d63-9a9b-4898471ce300', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:35:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:35:29.465 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[74284ecb-7826-49c2-b6d0-6ad9819b3c0d]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap2ab0e112-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 444928, 'tstamp': 444928}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210701, 'error': None, 'target': 'ovnmeta-2ab0e112-4ca7-4d63-9a9b-4898471ce300', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap2ab0e112-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 444932, 'tstamp': 444932}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210701, 'error': None, 'target': 'ovnmeta-2ab0e112-4ca7-4d63-9a9b-4898471ce300', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:35:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:35:29.467 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2ab0e112-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:35:29 compute-0 nova_compute[187223]: 2025-11-28 17:35:29.468 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:35:29 compute-0 nova_compute[187223]: 2025-11-28 17:35:29.469 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:35:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:35:29.470 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2ab0e112-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:35:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:35:29.470 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 17:35:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:35:29.470 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2ab0e112-40, col_values=(('external_ids', {'iface-id': 'b81f3cf4-9db1-4ab7-9c60-c031a201b3b5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:35:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:35:29.471 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 17:35:29 compute-0 nova_compute[187223]: 2025-11-28 17:35:29.615 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:35:29 compute-0 nova_compute[187223]: 2025-11-28 17:35:29.616 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 17:35:29 compute-0 nova_compute[187223]: 2025-11-28 17:35:29.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:35:29 compute-0 podman[197556]: time="2025-11-28T17:35:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:35:29 compute-0 podman[197556]: @ - - [28/Nov/2025:17:35:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18323 "" "Go-http-client/1.1"
Nov 28 17:35:29 compute-0 podman[197556]: @ - - [28/Nov/2025:17:35:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3055 "" "Go-http-client/1.1"
Nov 28 17:35:29 compute-0 nova_compute[187223]: 2025-11-28 17:35:29.788 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:35:29 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:35:29.961 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2ad2dbac-a967-40fb-b69b-7c374c5f8e9d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:35:30 compute-0 nova_compute[187223]: 2025-11-28 17:35:30.176 187227 INFO nova.compute.manager [None req-1a71c286-5776-4bcb-bbdb-029a50ee1d8a a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: e81c4034-7fd8-453d-9ece-6ce03cb4aa70] Post operation of migration started
Nov 28 17:35:30 compute-0 nova_compute[187223]: 2025-11-28 17:35:30.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:35:31 compute-0 openstack_network_exporter[199717]: ERROR   17:35:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:35:31 compute-0 openstack_network_exporter[199717]: ERROR   17:35:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:35:31 compute-0 openstack_network_exporter[199717]: ERROR   17:35:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:35:31 compute-0 openstack_network_exporter[199717]: ERROR   17:35:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:35:31 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:35:31 compute-0 openstack_network_exporter[199717]: ERROR   17:35:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:35:31 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:35:31 compute-0 nova_compute[187223]: 2025-11-28 17:35:31.685 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:35:31 compute-0 nova_compute[187223]: 2025-11-28 17:35:31.685 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 17:35:31 compute-0 nova_compute[187223]: 2025-11-28 17:35:31.685 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 17:35:32 compute-0 podman[210702]: 2025-11-28 17:35:32.253165152 +0000 UTC m=+0.089336075 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, container_name=openstack_network_exporter, distribution-scope=public, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, config_id=edpm, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.buildah.version=1.33.7)
Nov 28 17:35:32 compute-0 nova_compute[187223]: 2025-11-28 17:35:32.441 187227 DEBUG oslo_concurrency.lockutils [None req-1a71c286-5776-4bcb-bbdb-029a50ee1d8a a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "refresh_cache-e81c4034-7fd8-453d-9ece-6ce03cb4aa70" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 17:35:32 compute-0 nova_compute[187223]: 2025-11-28 17:35:32.441 187227 DEBUG oslo_concurrency.lockutils [None req-1a71c286-5776-4bcb-bbdb-029a50ee1d8a a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquired lock "refresh_cache-e81c4034-7fd8-453d-9ece-6ce03cb4aa70" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 17:35:32 compute-0 nova_compute[187223]: 2025-11-28 17:35:32.442 187227 DEBUG nova.network.neutron [None req-1a71c286-5776-4bcb-bbdb-029a50ee1d8a a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: e81c4034-7fd8-453d-9ece-6ce03cb4aa70] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 28 17:35:32 compute-0 nova_compute[187223]: 2025-11-28 17:35:32.453 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "refresh_cache-f333fabf-4a60-49fb-b6dc-d0cbeb847c8f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 17:35:32 compute-0 nova_compute[187223]: 2025-11-28 17:35:32.453 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquired lock "refresh_cache-f333fabf-4a60-49fb-b6dc-d0cbeb847c8f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 17:35:32 compute-0 nova_compute[187223]: 2025-11-28 17:35:32.453 187227 DEBUG nova.network.neutron [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 28 17:35:32 compute-0 nova_compute[187223]: 2025-11-28 17:35:32.454 187227 DEBUG nova.objects.instance [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f333fabf-4a60-49fb-b6dc-d0cbeb847c8f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:35:33 compute-0 nova_compute[187223]: 2025-11-28 17:35:33.195 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:35:34 compute-0 nova_compute[187223]: 2025-11-28 17:35:34.791 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:35:35 compute-0 nova_compute[187223]: 2025-11-28 17:35:35.642 187227 DEBUG nova.network.neutron [None req-1a71c286-5776-4bcb-bbdb-029a50ee1d8a a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: e81c4034-7fd8-453d-9ece-6ce03cb4aa70] Updating instance_info_cache with network_info: [{"id": "ec35ba7f-0da1-4858-91e5-70aaebb424d4", "address": "fa:16:3e:4a:01:0b", "network": {"id": "2ab0e112-4ca7-4d63-9a9b-4898471ce300", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-519856924-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d63484fb636c435b8307abd484cb8aa7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec35ba7f-0d", "ovs_interfaceid": "ec35ba7f-0da1-4858-91e5-70aaebb424d4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:35:35 compute-0 nova_compute[187223]: 2025-11-28 17:35:35.660 187227 DEBUG oslo_concurrency.lockutils [None req-1a71c286-5776-4bcb-bbdb-029a50ee1d8a a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Releasing lock "refresh_cache-e81c4034-7fd8-453d-9ece-6ce03cb4aa70" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 17:35:35 compute-0 nova_compute[187223]: 2025-11-28 17:35:35.677 187227 DEBUG oslo_concurrency.lockutils [None req-1a71c286-5776-4bcb-bbdb-029a50ee1d8a a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:35:35 compute-0 nova_compute[187223]: 2025-11-28 17:35:35.678 187227 DEBUG oslo_concurrency.lockutils [None req-1a71c286-5776-4bcb-bbdb-029a50ee1d8a a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:35:35 compute-0 nova_compute[187223]: 2025-11-28 17:35:35.678 187227 DEBUG oslo_concurrency.lockutils [None req-1a71c286-5776-4bcb-bbdb-029a50ee1d8a a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:35:35 compute-0 nova_compute[187223]: 2025-11-28 17:35:35.684 187227 INFO nova.virt.libvirt.driver [None req-1a71c286-5776-4bcb-bbdb-029a50ee1d8a a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: e81c4034-7fd8-453d-9ece-6ce03cb4aa70] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Nov 28 17:35:35 compute-0 virtqemud[186845]: Domain id=6 name='instance-00000008' uuid=e81c4034-7fd8-453d-9ece-6ce03cb4aa70 is tainted: custom-monitor
Nov 28 17:35:35 compute-0 nova_compute[187223]: 2025-11-28 17:35:35.791 187227 DEBUG nova.network.neutron [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] Updating instance_info_cache with network_info: [{"id": "b2417a6a-f805-4c80-adc3-9a9223d1c16a", "address": "fa:16:3e:28:45:e5", "network": {"id": "2ab0e112-4ca7-4d63-9a9b-4898471ce300", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-519856924-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d63484fb636c435b8307abd484cb8aa7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2417a6a-f8", "ovs_interfaceid": "b2417a6a-f805-4c80-adc3-9a9223d1c16a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:35:35 compute-0 nova_compute[187223]: 2025-11-28 17:35:35.810 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Releasing lock "refresh_cache-f333fabf-4a60-49fb-b6dc-d0cbeb847c8f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 17:35:35 compute-0 nova_compute[187223]: 2025-11-28 17:35:35.811 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 28 17:35:36 compute-0 nova_compute[187223]: 2025-11-28 17:35:36.696 187227 INFO nova.virt.libvirt.driver [None req-1a71c286-5776-4bcb-bbdb-029a50ee1d8a a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: e81c4034-7fd8-453d-9ece-6ce03cb4aa70] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Nov 28 17:35:37 compute-0 nova_compute[187223]: 2025-11-28 17:35:37.706 187227 INFO nova.virt.libvirt.driver [None req-1a71c286-5776-4bcb-bbdb-029a50ee1d8a a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: e81c4034-7fd8-453d-9ece-6ce03cb4aa70] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Nov 28 17:35:37 compute-0 nova_compute[187223]: 2025-11-28 17:35:37.714 187227 DEBUG nova.compute.manager [None req-1a71c286-5776-4bcb-bbdb-029a50ee1d8a a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: e81c4034-7fd8-453d-9ece-6ce03cb4aa70] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:35:37 compute-0 nova_compute[187223]: 2025-11-28 17:35:37.739 187227 DEBUG nova.objects.instance [None req-1a71c286-5776-4bcb-bbdb-029a50ee1d8a a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: e81c4034-7fd8-453d-9ece-6ce03cb4aa70] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Nov 28 17:35:38 compute-0 nova_compute[187223]: 2025-11-28 17:35:38.198 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:35:38 compute-0 nova_compute[187223]: 2025-11-28 17:35:38.804 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:35:39 compute-0 nova_compute[187223]: 2025-11-28 17:35:39.794 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:35:43 compute-0 nova_compute[187223]: 2025-11-28 17:35:43.201 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:35:43 compute-0 nova_compute[187223]: 2025-11-28 17:35:43.591 187227 DEBUG oslo_concurrency.lockutils [None req-cfac9fd3-5e1b-469e-86eb-e2658a239045 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Acquiring lock "e81c4034-7fd8-453d-9ece-6ce03cb4aa70" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:35:43 compute-0 nova_compute[187223]: 2025-11-28 17:35:43.592 187227 DEBUG oslo_concurrency.lockutils [None req-cfac9fd3-5e1b-469e-86eb-e2658a239045 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Lock "e81c4034-7fd8-453d-9ece-6ce03cb4aa70" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:35:43 compute-0 nova_compute[187223]: 2025-11-28 17:35:43.592 187227 DEBUG oslo_concurrency.lockutils [None req-cfac9fd3-5e1b-469e-86eb-e2658a239045 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Acquiring lock "e81c4034-7fd8-453d-9ece-6ce03cb4aa70-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:35:43 compute-0 nova_compute[187223]: 2025-11-28 17:35:43.592 187227 DEBUG oslo_concurrency.lockutils [None req-cfac9fd3-5e1b-469e-86eb-e2658a239045 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Lock "e81c4034-7fd8-453d-9ece-6ce03cb4aa70-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:35:43 compute-0 nova_compute[187223]: 2025-11-28 17:35:43.593 187227 DEBUG oslo_concurrency.lockutils [None req-cfac9fd3-5e1b-469e-86eb-e2658a239045 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Lock "e81c4034-7fd8-453d-9ece-6ce03cb4aa70-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:35:43 compute-0 nova_compute[187223]: 2025-11-28 17:35:43.594 187227 INFO nova.compute.manager [None req-cfac9fd3-5e1b-469e-86eb-e2658a239045 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] [instance: e81c4034-7fd8-453d-9ece-6ce03cb4aa70] Terminating instance
Nov 28 17:35:43 compute-0 nova_compute[187223]: 2025-11-28 17:35:43.595 187227 DEBUG nova.compute.manager [None req-cfac9fd3-5e1b-469e-86eb-e2658a239045 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] [instance: e81c4034-7fd8-453d-9ece-6ce03cb4aa70] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 28 17:35:43 compute-0 kernel: tapec35ba7f-0d (unregistering): left promiscuous mode
Nov 28 17:35:43 compute-0 NetworkManager[55763]: <info>  [1764351343.6311] device (tapec35ba7f-0d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 28 17:35:43 compute-0 nova_compute[187223]: 2025-11-28 17:35:43.642 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:35:43 compute-0 ovn_controller[95574]: 2025-11-28T17:35:43Z|00071|binding|INFO|Releasing lport ec35ba7f-0da1-4858-91e5-70aaebb424d4 from this chassis (sb_readonly=0)
Nov 28 17:35:43 compute-0 ovn_controller[95574]: 2025-11-28T17:35:43Z|00072|binding|INFO|Setting lport ec35ba7f-0da1-4858-91e5-70aaebb424d4 down in Southbound
Nov 28 17:35:43 compute-0 ovn_controller[95574]: 2025-11-28T17:35:43Z|00073|binding|INFO|Removing iface tapec35ba7f-0d ovn-installed in OVS
Nov 28 17:35:43 compute-0 nova_compute[187223]: 2025-11-28 17:35:43.645 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:35:43 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:35:43.662 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4a:01:0b 10.100.0.14'], port_security=['fa:16:3e:4a:01:0b 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'e81c4034-7fd8-453d-9ece-6ce03cb4aa70', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2ab0e112-4ca7-4d63-9a9b-4898471ce300', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd63484fb636c435b8307abd484cb8aa7', 'neutron:revision_number': '13', 'neutron:security_group_ids': '157785b8-acb3-45e0-be55-3b141f81f23f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8bfaf540-d584-4e5a-842f-49ecdc70c0d8, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], logical_port=ec35ba7f-0da1-4858-91e5-70aaebb424d4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 17:35:43 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:35:43.664 104433 INFO neutron.agent.ovn.metadata.agent [-] Port ec35ba7f-0da1-4858-91e5-70aaebb424d4 in datapath 2ab0e112-4ca7-4d63-9a9b-4898471ce300 unbound from our chassis
Nov 28 17:35:43 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:35:43.666 104433 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2ab0e112-4ca7-4d63-9a9b-4898471ce300
Nov 28 17:35:43 compute-0 nova_compute[187223]: 2025-11-28 17:35:43.667 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:35:43 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:35:43.685 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[bd74f347-817d-4cfe-9aea-2c2cc9b70d19]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:35:43 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000008.scope: Deactivated successfully.
Nov 28 17:35:43 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000008.scope: Consumed 1.808s CPU time.
Nov 28 17:35:43 compute-0 systemd-machined[153517]: Machine qemu-6-instance-00000008 terminated.
Nov 28 17:35:43 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:35:43.724 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[5704f0eb-d550-4088-9320-f0d2e8ade0de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:35:43 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:35:43.728 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[b326af28-f8ca-42e8-b48a-c6441151213e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:35:43 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:35:43.762 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[ab01fb8d-ee30-4d8f-afe5-b251b037b6f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:35:43 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:35:43.782 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[6adc90b9-9688-4262-b91f-46ec177f726d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2ab0e112-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4f:2a:da'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 21], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 444915, 'reachable_time': 44203, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210736, 'error': None, 'target': 'ovnmeta-2ab0e112-4ca7-4d63-9a9b-4898471ce300', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:35:43 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:35:43.801 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[fb53ab60-1467-43af-82cf-b413c5cf83fa]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap2ab0e112-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 444928, 'tstamp': 444928}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210737, 'error': None, 'target': 'ovnmeta-2ab0e112-4ca7-4d63-9a9b-4898471ce300', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap2ab0e112-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 444932, 'tstamp': 444932}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210737, 'error': None, 'target': 'ovnmeta-2ab0e112-4ca7-4d63-9a9b-4898471ce300', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:35:43 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:35:43.803 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2ab0e112-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:35:43 compute-0 nova_compute[187223]: 2025-11-28 17:35:43.805 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:35:43 compute-0 nova_compute[187223]: 2025-11-28 17:35:43.812 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:35:43 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:35:43.812 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2ab0e112-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:35:43 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:35:43.812 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 17:35:43 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:35:43.813 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2ab0e112-40, col_values=(('external_ids', {'iface-id': 'b81f3cf4-9db1-4ab7-9c60-c031a201b3b5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:35:43 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:35:43.813 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 17:35:43 compute-0 nova_compute[187223]: 2025-11-28 17:35:43.880 187227 INFO nova.virt.libvirt.driver [-] [instance: e81c4034-7fd8-453d-9ece-6ce03cb4aa70] Instance destroyed successfully.
Nov 28 17:35:43 compute-0 nova_compute[187223]: 2025-11-28 17:35:43.881 187227 DEBUG nova.objects.instance [None req-cfac9fd3-5e1b-469e-86eb-e2658a239045 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Lazy-loading 'resources' on Instance uuid e81c4034-7fd8-453d-9ece-6ce03cb4aa70 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:35:43 compute-0 nova_compute[187223]: 2025-11-28 17:35:43.894 187227 DEBUG nova.virt.libvirt.vif [None req-cfac9fd3-5e1b-469e-86eb-e2658a239045 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-28T17:34:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-1256551644',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-1256551644',id=8,image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-28T17:34:47Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d63484fb636c435b8307abd484cb8aa7',ramdisk_id='',reservation_id='r-3p9uqqsj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ra
m='0',owner_project_name='tempest-TestExecuteBasicStrategy-245891966',owner_user_name='tempest-TestExecuteBasicStrategy-245891966-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-28T17:35:37Z,user_data=None,user_id='a2fe03a116c1411ebaa81bbd0334f5ed',uuid=e81c4034-7fd8-453d-9ece-6ce03cb4aa70,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ec35ba7f-0da1-4858-91e5-70aaebb424d4", "address": "fa:16:3e:4a:01:0b", "network": {"id": "2ab0e112-4ca7-4d63-9a9b-4898471ce300", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-519856924-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d63484fb636c435b8307abd484cb8aa7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec35ba7f-0d", "ovs_interfaceid": "ec35ba7f-0da1-4858-91e5-70aaebb424d4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 28 17:35:43 compute-0 nova_compute[187223]: 2025-11-28 17:35:43.894 187227 DEBUG nova.network.os_vif_util [None req-cfac9fd3-5e1b-469e-86eb-e2658a239045 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Converting VIF {"id": "ec35ba7f-0da1-4858-91e5-70aaebb424d4", "address": "fa:16:3e:4a:01:0b", "network": {"id": "2ab0e112-4ca7-4d63-9a9b-4898471ce300", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-519856924-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d63484fb636c435b8307abd484cb8aa7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec35ba7f-0d", "ovs_interfaceid": "ec35ba7f-0da1-4858-91e5-70aaebb424d4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 17:35:43 compute-0 nova_compute[187223]: 2025-11-28 17:35:43.895 187227 DEBUG nova.network.os_vif_util [None req-cfac9fd3-5e1b-469e-86eb-e2658a239045 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:4a:01:0b,bridge_name='br-int',has_traffic_filtering=True,id=ec35ba7f-0da1-4858-91e5-70aaebb424d4,network=Network(2ab0e112-4ca7-4d63-9a9b-4898471ce300),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec35ba7f-0d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 17:35:43 compute-0 nova_compute[187223]: 2025-11-28 17:35:43.895 187227 DEBUG os_vif [None req-cfac9fd3-5e1b-469e-86eb-e2658a239045 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:4a:01:0b,bridge_name='br-int',has_traffic_filtering=True,id=ec35ba7f-0da1-4858-91e5-70aaebb424d4,network=Network(2ab0e112-4ca7-4d63-9a9b-4898471ce300),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec35ba7f-0d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 28 17:35:43 compute-0 nova_compute[187223]: 2025-11-28 17:35:43.897 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:35:43 compute-0 nova_compute[187223]: 2025-11-28 17:35:43.897 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapec35ba7f-0d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:35:43 compute-0 nova_compute[187223]: 2025-11-28 17:35:43.900 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:35:43 compute-0 nova_compute[187223]: 2025-11-28 17:35:43.902 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 17:35:43 compute-0 nova_compute[187223]: 2025-11-28 17:35:43.903 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:35:43 compute-0 nova_compute[187223]: 2025-11-28 17:35:43.907 187227 INFO os_vif [None req-cfac9fd3-5e1b-469e-86eb-e2658a239045 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:4a:01:0b,bridge_name='br-int',has_traffic_filtering=True,id=ec35ba7f-0da1-4858-91e5-70aaebb424d4,network=Network(2ab0e112-4ca7-4d63-9a9b-4898471ce300),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec35ba7f-0d')
Nov 28 17:35:43 compute-0 nova_compute[187223]: 2025-11-28 17:35:43.907 187227 INFO nova.virt.libvirt.driver [None req-cfac9fd3-5e1b-469e-86eb-e2658a239045 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] [instance: e81c4034-7fd8-453d-9ece-6ce03cb4aa70] Deleting instance files /var/lib/nova/instances/e81c4034-7fd8-453d-9ece-6ce03cb4aa70_del
Nov 28 17:35:43 compute-0 nova_compute[187223]: 2025-11-28 17:35:43.908 187227 INFO nova.virt.libvirt.driver [None req-cfac9fd3-5e1b-469e-86eb-e2658a239045 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] [instance: e81c4034-7fd8-453d-9ece-6ce03cb4aa70] Deletion of /var/lib/nova/instances/e81c4034-7fd8-453d-9ece-6ce03cb4aa70_del complete
Nov 28 17:35:43 compute-0 nova_compute[187223]: 2025-11-28 17:35:43.981 187227 INFO nova.compute.manager [None req-cfac9fd3-5e1b-469e-86eb-e2658a239045 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] [instance: e81c4034-7fd8-453d-9ece-6ce03cb4aa70] Took 0.39 seconds to destroy the instance on the hypervisor.
Nov 28 17:35:43 compute-0 nova_compute[187223]: 2025-11-28 17:35:43.983 187227 DEBUG oslo.service.loopingcall [None req-cfac9fd3-5e1b-469e-86eb-e2658a239045 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 28 17:35:43 compute-0 nova_compute[187223]: 2025-11-28 17:35:43.984 187227 DEBUG nova.compute.manager [-] [instance: e81c4034-7fd8-453d-9ece-6ce03cb4aa70] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 28 17:35:43 compute-0 nova_compute[187223]: 2025-11-28 17:35:43.984 187227 DEBUG nova.network.neutron [-] [instance: e81c4034-7fd8-453d-9ece-6ce03cb4aa70] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 28 17:35:44 compute-0 nova_compute[187223]: 2025-11-28 17:35:44.652 187227 DEBUG nova.compute.manager [req-a70314ba-7a16-41e0-9697-35caa3fb4d31 req-12fbc90c-70ac-45d6-9d73-73e4f5715ea1 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: e81c4034-7fd8-453d-9ece-6ce03cb4aa70] Received event network-vif-unplugged-ec35ba7f-0da1-4858-91e5-70aaebb424d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:35:44 compute-0 nova_compute[187223]: 2025-11-28 17:35:44.653 187227 DEBUG oslo_concurrency.lockutils [req-a70314ba-7a16-41e0-9697-35caa3fb4d31 req-12fbc90c-70ac-45d6-9d73-73e4f5715ea1 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "e81c4034-7fd8-453d-9ece-6ce03cb4aa70-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:35:44 compute-0 nova_compute[187223]: 2025-11-28 17:35:44.653 187227 DEBUG oslo_concurrency.lockutils [req-a70314ba-7a16-41e0-9697-35caa3fb4d31 req-12fbc90c-70ac-45d6-9d73-73e4f5715ea1 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "e81c4034-7fd8-453d-9ece-6ce03cb4aa70-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:35:44 compute-0 nova_compute[187223]: 2025-11-28 17:35:44.654 187227 DEBUG oslo_concurrency.lockutils [req-a70314ba-7a16-41e0-9697-35caa3fb4d31 req-12fbc90c-70ac-45d6-9d73-73e4f5715ea1 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "e81c4034-7fd8-453d-9ece-6ce03cb4aa70-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:35:44 compute-0 nova_compute[187223]: 2025-11-28 17:35:44.654 187227 DEBUG nova.compute.manager [req-a70314ba-7a16-41e0-9697-35caa3fb4d31 req-12fbc90c-70ac-45d6-9d73-73e4f5715ea1 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: e81c4034-7fd8-453d-9ece-6ce03cb4aa70] No waiting events found dispatching network-vif-unplugged-ec35ba7f-0da1-4858-91e5-70aaebb424d4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:35:44 compute-0 nova_compute[187223]: 2025-11-28 17:35:44.654 187227 DEBUG nova.compute.manager [req-a70314ba-7a16-41e0-9697-35caa3fb4d31 req-12fbc90c-70ac-45d6-9d73-73e4f5715ea1 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: e81c4034-7fd8-453d-9ece-6ce03cb4aa70] Received event network-vif-unplugged-ec35ba7f-0da1-4858-91e5-70aaebb424d4 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 28 17:35:44 compute-0 nova_compute[187223]: 2025-11-28 17:35:44.892 187227 DEBUG nova.network.neutron [-] [instance: e81c4034-7fd8-453d-9ece-6ce03cb4aa70] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:35:44 compute-0 nova_compute[187223]: 2025-11-28 17:35:44.915 187227 INFO nova.compute.manager [-] [instance: e81c4034-7fd8-453d-9ece-6ce03cb4aa70] Took 0.93 seconds to deallocate network for instance.
Nov 28 17:35:44 compute-0 nova_compute[187223]: 2025-11-28 17:35:44.956 187227 DEBUG nova.compute.manager [req-8b87b3f8-ca3c-45f8-9d86-218a4efdcc8c req-206675a0-d8df-4e92-aa0b-409765f3c7ef 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: e81c4034-7fd8-453d-9ece-6ce03cb4aa70] Received event network-vif-deleted-ec35ba7f-0da1-4858-91e5-70aaebb424d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:35:44 compute-0 nova_compute[187223]: 2025-11-28 17:35:44.959 187227 DEBUG oslo_concurrency.lockutils [None req-cfac9fd3-5e1b-469e-86eb-e2658a239045 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:35:44 compute-0 nova_compute[187223]: 2025-11-28 17:35:44.959 187227 DEBUG oslo_concurrency.lockutils [None req-cfac9fd3-5e1b-469e-86eb-e2658a239045 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:35:44 compute-0 nova_compute[187223]: 2025-11-28 17:35:44.965 187227 DEBUG oslo_concurrency.lockutils [None req-cfac9fd3-5e1b-469e-86eb-e2658a239045 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.006s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:35:44 compute-0 nova_compute[187223]: 2025-11-28 17:35:44.998 187227 INFO nova.scheduler.client.report [None req-cfac9fd3-5e1b-469e-86eb-e2658a239045 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Deleted allocations for instance e81c4034-7fd8-453d-9ece-6ce03cb4aa70
Nov 28 17:35:45 compute-0 nova_compute[187223]: 2025-11-28 17:35:45.069 187227 DEBUG oslo_concurrency.lockutils [None req-cfac9fd3-5e1b-469e-86eb-e2658a239045 a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Lock "e81c4034-7fd8-453d-9ece-6ce03cb4aa70" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.477s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:35:46 compute-0 nova_compute[187223]: 2025-11-28 17:35:46.335 187227 DEBUG oslo_concurrency.lockutils [None req-5974f676-62c9-437d-9966-0ac310c250ea a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Acquiring lock "f333fabf-4a60-49fb-b6dc-d0cbeb847c8f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:35:46 compute-0 nova_compute[187223]: 2025-11-28 17:35:46.336 187227 DEBUG oslo_concurrency.lockutils [None req-5974f676-62c9-437d-9966-0ac310c250ea a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Lock "f333fabf-4a60-49fb-b6dc-d0cbeb847c8f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:35:46 compute-0 nova_compute[187223]: 2025-11-28 17:35:46.336 187227 DEBUG oslo_concurrency.lockutils [None req-5974f676-62c9-437d-9966-0ac310c250ea a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Acquiring lock "f333fabf-4a60-49fb-b6dc-d0cbeb847c8f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:35:46 compute-0 nova_compute[187223]: 2025-11-28 17:35:46.337 187227 DEBUG oslo_concurrency.lockutils [None req-5974f676-62c9-437d-9966-0ac310c250ea a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Lock "f333fabf-4a60-49fb-b6dc-d0cbeb847c8f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:35:46 compute-0 nova_compute[187223]: 2025-11-28 17:35:46.337 187227 DEBUG oslo_concurrency.lockutils [None req-5974f676-62c9-437d-9966-0ac310c250ea a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Lock "f333fabf-4a60-49fb-b6dc-d0cbeb847c8f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:35:46 compute-0 nova_compute[187223]: 2025-11-28 17:35:46.339 187227 INFO nova.compute.manager [None req-5974f676-62c9-437d-9966-0ac310c250ea a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] Terminating instance
Nov 28 17:35:46 compute-0 nova_compute[187223]: 2025-11-28 17:35:46.342 187227 DEBUG nova.compute.manager [None req-5974f676-62c9-437d-9966-0ac310c250ea a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 28 17:35:46 compute-0 kernel: tapb2417a6a-f8 (unregistering): left promiscuous mode
Nov 28 17:35:46 compute-0 NetworkManager[55763]: <info>  [1764351346.3658] device (tapb2417a6a-f8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 28 17:35:46 compute-0 ovn_controller[95574]: 2025-11-28T17:35:46Z|00074|binding|INFO|Releasing lport b2417a6a-f805-4c80-adc3-9a9223d1c16a from this chassis (sb_readonly=0)
Nov 28 17:35:46 compute-0 ovn_controller[95574]: 2025-11-28T17:35:46Z|00075|binding|INFO|Setting lport b2417a6a-f805-4c80-adc3-9a9223d1c16a down in Southbound
Nov 28 17:35:46 compute-0 ovn_controller[95574]: 2025-11-28T17:35:46Z|00076|binding|INFO|Removing iface tapb2417a6a-f8 ovn-installed in OVS
Nov 28 17:35:46 compute-0 nova_compute[187223]: 2025-11-28 17:35:46.372 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:35:46 compute-0 nova_compute[187223]: 2025-11-28 17:35:46.375 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:35:46 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:35:46.385 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:28:45:e5 10.100.0.12'], port_security=['fa:16:3e:28:45:e5 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'f333fabf-4a60-49fb-b6dc-d0cbeb847c8f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2ab0e112-4ca7-4d63-9a9b-4898471ce300', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd63484fb636c435b8307abd484cb8aa7', 'neutron:revision_number': '4', 'neutron:security_group_ids': '157785b8-acb3-45e0-be55-3b141f81f23f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8bfaf540-d584-4e5a-842f-49ecdc70c0d8, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], logical_port=b2417a6a-f805-4c80-adc3-9a9223d1c16a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 17:35:46 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:35:46.386 104433 INFO neutron.agent.ovn.metadata.agent [-] Port b2417a6a-f805-4c80-adc3-9a9223d1c16a in datapath 2ab0e112-4ca7-4d63-9a9b-4898471ce300 unbound from our chassis
Nov 28 17:35:46 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:35:46.388 104433 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2ab0e112-4ca7-4d63-9a9b-4898471ce300, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 17:35:46 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:35:46.389 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[b5e30191-360c-4832-927e-22c7b0513f39]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:35:46 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:35:46.390 104433 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-2ab0e112-4ca7-4d63-9a9b-4898471ce300 namespace which is not needed anymore
Nov 28 17:35:46 compute-0 nova_compute[187223]: 2025-11-28 17:35:46.392 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:35:46 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000007.scope: Deactivated successfully.
Nov 28 17:35:46 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000007.scope: Consumed 15.888s CPU time.
Nov 28 17:35:46 compute-0 systemd-machined[153517]: Machine qemu-5-instance-00000007 terminated.
Nov 28 17:35:46 compute-0 neutron-haproxy-ovnmeta-2ab0e112-4ca7-4d63-9a9b-4898471ce300[210362]: [NOTICE]   (210366) : haproxy version is 2.8.14-c23fe91
Nov 28 17:35:46 compute-0 neutron-haproxy-ovnmeta-2ab0e112-4ca7-4d63-9a9b-4898471ce300[210362]: [NOTICE]   (210366) : path to executable is /usr/sbin/haproxy
Nov 28 17:35:46 compute-0 neutron-haproxy-ovnmeta-2ab0e112-4ca7-4d63-9a9b-4898471ce300[210362]: [WARNING]  (210366) : Exiting Master process...
Nov 28 17:35:46 compute-0 neutron-haproxy-ovnmeta-2ab0e112-4ca7-4d63-9a9b-4898471ce300[210362]: [ALERT]    (210366) : Current worker (210368) exited with code 143 (Terminated)
Nov 28 17:35:46 compute-0 neutron-haproxy-ovnmeta-2ab0e112-4ca7-4d63-9a9b-4898471ce300[210362]: [WARNING]  (210366) : All workers exited. Exiting... (0)
Nov 28 17:35:46 compute-0 systemd[1]: libpod-b6560664c758b873694d0d67340fc7c4ce75ecc847a9a03edde5f9eedcd32d04.scope: Deactivated successfully.
Nov 28 17:35:46 compute-0 podman[210779]: 2025-11-28 17:35:46.555499043 +0000 UTC m=+0.049899605 container died b6560664c758b873694d0d67340fc7c4ce75ecc847a9a03edde5f9eedcd32d04 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2ab0e112-4ca7-4d63-9a9b-4898471ce300, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 17:35:46 compute-0 NetworkManager[55763]: <info>  [1764351346.5667] manager: (tapb2417a6a-f8): new Tun device (/org/freedesktop/NetworkManager/Devices/36)
Nov 28 17:35:46 compute-0 nova_compute[187223]: 2025-11-28 17:35:46.567 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:35:46 compute-0 nova_compute[187223]: 2025-11-28 17:35:46.575 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:35:46 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b6560664c758b873694d0d67340fc7c4ce75ecc847a9a03edde5f9eedcd32d04-userdata-shm.mount: Deactivated successfully.
Nov 28 17:35:46 compute-0 systemd[1]: var-lib-containers-storage-overlay-7b282ad18e40fd7d1802de594f75fd2ec134ce10e99c853b7bd80f865b9ca028-merged.mount: Deactivated successfully.
Nov 28 17:35:46 compute-0 podman[210779]: 2025-11-28 17:35:46.602084711 +0000 UTC m=+0.096485253 container cleanup b6560664c758b873694d0d67340fc7c4ce75ecc847a9a03edde5f9eedcd32d04 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2ab0e112-4ca7-4d63-9a9b-4898471ce300, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 17:35:46 compute-0 systemd[1]: libpod-conmon-b6560664c758b873694d0d67340fc7c4ce75ecc847a9a03edde5f9eedcd32d04.scope: Deactivated successfully.
Nov 28 17:35:46 compute-0 nova_compute[187223]: 2025-11-28 17:35:46.619 187227 INFO nova.virt.libvirt.driver [-] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] Instance destroyed successfully.
Nov 28 17:35:46 compute-0 nova_compute[187223]: 2025-11-28 17:35:46.619 187227 DEBUG nova.objects.instance [None req-5974f676-62c9-437d-9966-0ac310c250ea a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Lazy-loading 'resources' on Instance uuid f333fabf-4a60-49fb-b6dc-d0cbeb847c8f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:35:46 compute-0 nova_compute[187223]: 2025-11-28 17:35:46.634 187227 DEBUG nova.virt.libvirt.vif [None req-5974f676-62c9-437d-9966-0ac310c250ea a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T17:34:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-322511206',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-322511206',id=7,image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-28T17:34:34Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d63484fb636c435b8307abd484cb8aa7',ramdisk_id='',reservation_id='r-5fc603x4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteBasicStrategy-245891966',owner_user_name='tempest-TestExecuteBasicStrategy-245891966-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-28T17:34:34Z,user_data=None,user_id='a2fe03a116c1411ebaa81bbd0334f5ed',uuid=f333fabf-4a60-49fb-b6dc-d0cbeb847c8f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b2417a6a-f805-4c80-adc3-9a9223d1c16a", "address": "fa:16:3e:28:45:e5", "network": {"id": "2ab0e112-4ca7-4d63-9a9b-4898471ce300", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-519856924-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d63484fb636c435b8307abd484cb8aa7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2417a6a-f8", "ovs_interfaceid": "b2417a6a-f805-4c80-adc3-9a9223d1c16a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 28 17:35:46 compute-0 nova_compute[187223]: 2025-11-28 17:35:46.635 187227 DEBUG nova.network.os_vif_util [None req-5974f676-62c9-437d-9966-0ac310c250ea a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Converting VIF {"id": "b2417a6a-f805-4c80-adc3-9a9223d1c16a", "address": "fa:16:3e:28:45:e5", "network": {"id": "2ab0e112-4ca7-4d63-9a9b-4898471ce300", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-519856924-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d63484fb636c435b8307abd484cb8aa7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2417a6a-f8", "ovs_interfaceid": "b2417a6a-f805-4c80-adc3-9a9223d1c16a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 17:35:46 compute-0 nova_compute[187223]: 2025-11-28 17:35:46.636 187227 DEBUG nova.network.os_vif_util [None req-5974f676-62c9-437d-9966-0ac310c250ea a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:28:45:e5,bridge_name='br-int',has_traffic_filtering=True,id=b2417a6a-f805-4c80-adc3-9a9223d1c16a,network=Network(2ab0e112-4ca7-4d63-9a9b-4898471ce300),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb2417a6a-f8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 17:35:46 compute-0 nova_compute[187223]: 2025-11-28 17:35:46.636 187227 DEBUG os_vif [None req-5974f676-62c9-437d-9966-0ac310c250ea a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:28:45:e5,bridge_name='br-int',has_traffic_filtering=True,id=b2417a6a-f805-4c80-adc3-9a9223d1c16a,network=Network(2ab0e112-4ca7-4d63-9a9b-4898471ce300),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb2417a6a-f8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 28 17:35:46 compute-0 nova_compute[187223]: 2025-11-28 17:35:46.638 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:35:46 compute-0 nova_compute[187223]: 2025-11-28 17:35:46.639 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb2417a6a-f8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:35:46 compute-0 nova_compute[187223]: 2025-11-28 17:35:46.642 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:35:46 compute-0 nova_compute[187223]: 2025-11-28 17:35:46.645 187227 INFO os_vif [None req-5974f676-62c9-437d-9966-0ac310c250ea a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:28:45:e5,bridge_name='br-int',has_traffic_filtering=True,id=b2417a6a-f805-4c80-adc3-9a9223d1c16a,network=Network(2ab0e112-4ca7-4d63-9a9b-4898471ce300),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb2417a6a-f8')
Nov 28 17:35:46 compute-0 nova_compute[187223]: 2025-11-28 17:35:46.646 187227 INFO nova.virt.libvirt.driver [None req-5974f676-62c9-437d-9966-0ac310c250ea a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] Deleting instance files /var/lib/nova/instances/f333fabf-4a60-49fb-b6dc-d0cbeb847c8f_del
Nov 28 17:35:46 compute-0 nova_compute[187223]: 2025-11-28 17:35:46.647 187227 INFO nova.virt.libvirt.driver [None req-5974f676-62c9-437d-9966-0ac310c250ea a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] Deletion of /var/lib/nova/instances/f333fabf-4a60-49fb-b6dc-d0cbeb847c8f_del complete
Nov 28 17:35:46 compute-0 podman[210823]: 2025-11-28 17:35:46.677473171 +0000 UTC m=+0.050778630 container remove b6560664c758b873694d0d67340fc7c4ce75ecc847a9a03edde5f9eedcd32d04 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2ab0e112-4ca7-4d63-9a9b-4898471ce300, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 17:35:46 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:35:46.685 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[a4d2c06a-39d3-4c07-af30-511407d09286]: (4, ('Fri Nov 28 05:35:46 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-2ab0e112-4ca7-4d63-9a9b-4898471ce300 (b6560664c758b873694d0d67340fc7c4ce75ecc847a9a03edde5f9eedcd32d04)\nb6560664c758b873694d0d67340fc7c4ce75ecc847a9a03edde5f9eedcd32d04\nFri Nov 28 05:35:46 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-2ab0e112-4ca7-4d63-9a9b-4898471ce300 (b6560664c758b873694d0d67340fc7c4ce75ecc847a9a03edde5f9eedcd32d04)\nb6560664c758b873694d0d67340fc7c4ce75ecc847a9a03edde5f9eedcd32d04\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:35:46 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:35:46.688 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[8acf0089-9a71-461a-b81d-6bc86f1781db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:35:46 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:35:46.690 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2ab0e112-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:35:46 compute-0 nova_compute[187223]: 2025-11-28 17:35:46.692 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:35:46 compute-0 kernel: tap2ab0e112-40: left promiscuous mode
Nov 28 17:35:46 compute-0 nova_compute[187223]: 2025-11-28 17:35:46.704 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:35:46 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:35:46.706 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[a1007033-7ffe-4774-bf02-ace7ac15361e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:35:46 compute-0 nova_compute[187223]: 2025-11-28 17:35:46.709 187227 INFO nova.compute.manager [None req-5974f676-62c9-437d-9966-0ac310c250ea a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] Took 0.37 seconds to destroy the instance on the hypervisor.
Nov 28 17:35:46 compute-0 nova_compute[187223]: 2025-11-28 17:35:46.710 187227 DEBUG oslo.service.loopingcall [None req-5974f676-62c9-437d-9966-0ac310c250ea a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 28 17:35:46 compute-0 nova_compute[187223]: 2025-11-28 17:35:46.710 187227 DEBUG nova.compute.manager [-] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 28 17:35:46 compute-0 nova_compute[187223]: 2025-11-28 17:35:46.711 187227 DEBUG nova.network.neutron [-] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 28 17:35:46 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:35:46.721 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[2b46aa33-5ded-4e7f-8757-95fd082ae1a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:35:46 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:35:46.724 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[c8a71475-7ff5-4a5f-84da-bad5d99ce45a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:35:46 compute-0 nova_compute[187223]: 2025-11-28 17:35:46.736 187227 DEBUG nova.compute.manager [req-dfc21d5e-d96e-4640-9ffb-e321729f9f64 req-8fec2a98-ad8a-4398-be7d-e8bda3436898 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: e81c4034-7fd8-453d-9ece-6ce03cb4aa70] Received event network-vif-plugged-ec35ba7f-0da1-4858-91e5-70aaebb424d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:35:46 compute-0 nova_compute[187223]: 2025-11-28 17:35:46.736 187227 DEBUG oslo_concurrency.lockutils [req-dfc21d5e-d96e-4640-9ffb-e321729f9f64 req-8fec2a98-ad8a-4398-be7d-e8bda3436898 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "e81c4034-7fd8-453d-9ece-6ce03cb4aa70-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:35:46 compute-0 nova_compute[187223]: 2025-11-28 17:35:46.737 187227 DEBUG oslo_concurrency.lockutils [req-dfc21d5e-d96e-4640-9ffb-e321729f9f64 req-8fec2a98-ad8a-4398-be7d-e8bda3436898 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "e81c4034-7fd8-453d-9ece-6ce03cb4aa70-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:35:46 compute-0 nova_compute[187223]: 2025-11-28 17:35:46.737 187227 DEBUG oslo_concurrency.lockutils [req-dfc21d5e-d96e-4640-9ffb-e321729f9f64 req-8fec2a98-ad8a-4398-be7d-e8bda3436898 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "e81c4034-7fd8-453d-9ece-6ce03cb4aa70-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:35:46 compute-0 nova_compute[187223]: 2025-11-28 17:35:46.737 187227 DEBUG nova.compute.manager [req-dfc21d5e-d96e-4640-9ffb-e321729f9f64 req-8fec2a98-ad8a-4398-be7d-e8bda3436898 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: e81c4034-7fd8-453d-9ece-6ce03cb4aa70] No waiting events found dispatching network-vif-plugged-ec35ba7f-0da1-4858-91e5-70aaebb424d4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:35:46 compute-0 nova_compute[187223]: 2025-11-28 17:35:46.737 187227 WARNING nova.compute.manager [req-dfc21d5e-d96e-4640-9ffb-e321729f9f64 req-8fec2a98-ad8a-4398-be7d-e8bda3436898 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: e81c4034-7fd8-453d-9ece-6ce03cb4aa70] Received unexpected event network-vif-plugged-ec35ba7f-0da1-4858-91e5-70aaebb424d4 for instance with vm_state deleted and task_state None.
Nov 28 17:35:46 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:35:46.748 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[eec9f5b0-b9b7-4c5c-9ba9-7c6e320a8e40]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 444908, 'reachable_time': 44145, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210836, 'error': None, 'target': 'ovnmeta-2ab0e112-4ca7-4d63-9a9b-4898471ce300', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:35:46 compute-0 systemd[1]: run-netns-ovnmeta\x2d2ab0e112\x2d4ca7\x2d4d63\x2d9a9b\x2d4898471ce300.mount: Deactivated successfully.
Nov 28 17:35:46 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:35:46.752 104546 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-2ab0e112-4ca7-4d63-9a9b-4898471ce300 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 28 17:35:46 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:35:46.755 104546 DEBUG oslo.privsep.daemon [-] privsep: reply[59561aef-8558-4c5c-8089-6d411efa0955]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:35:47 compute-0 nova_compute[187223]: 2025-11-28 17:35:47.051 187227 DEBUG nova.compute.manager [req-0d587954-09c0-4997-ad9c-7ba2018ac570 req-3dc333cb-4349-4cda-b5ef-5f43bcd54a74 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] Received event network-vif-unplugged-b2417a6a-f805-4c80-adc3-9a9223d1c16a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:35:47 compute-0 nova_compute[187223]: 2025-11-28 17:35:47.051 187227 DEBUG oslo_concurrency.lockutils [req-0d587954-09c0-4997-ad9c-7ba2018ac570 req-3dc333cb-4349-4cda-b5ef-5f43bcd54a74 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "f333fabf-4a60-49fb-b6dc-d0cbeb847c8f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:35:47 compute-0 nova_compute[187223]: 2025-11-28 17:35:47.052 187227 DEBUG oslo_concurrency.lockutils [req-0d587954-09c0-4997-ad9c-7ba2018ac570 req-3dc333cb-4349-4cda-b5ef-5f43bcd54a74 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "f333fabf-4a60-49fb-b6dc-d0cbeb847c8f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:35:47 compute-0 nova_compute[187223]: 2025-11-28 17:35:47.052 187227 DEBUG oslo_concurrency.lockutils [req-0d587954-09c0-4997-ad9c-7ba2018ac570 req-3dc333cb-4349-4cda-b5ef-5f43bcd54a74 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "f333fabf-4a60-49fb-b6dc-d0cbeb847c8f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:35:47 compute-0 nova_compute[187223]: 2025-11-28 17:35:47.052 187227 DEBUG nova.compute.manager [req-0d587954-09c0-4997-ad9c-7ba2018ac570 req-3dc333cb-4349-4cda-b5ef-5f43bcd54a74 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] No waiting events found dispatching network-vif-unplugged-b2417a6a-f805-4c80-adc3-9a9223d1c16a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:35:47 compute-0 nova_compute[187223]: 2025-11-28 17:35:47.052 187227 DEBUG nova.compute.manager [req-0d587954-09c0-4997-ad9c-7ba2018ac570 req-3dc333cb-4349-4cda-b5ef-5f43bcd54a74 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] Received event network-vif-unplugged-b2417a6a-f805-4c80-adc3-9a9223d1c16a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 28 17:35:47 compute-0 nova_compute[187223]: 2025-11-28 17:35:47.053 187227 DEBUG nova.compute.manager [req-0d587954-09c0-4997-ad9c-7ba2018ac570 req-3dc333cb-4349-4cda-b5ef-5f43bcd54a74 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] Received event network-vif-plugged-b2417a6a-f805-4c80-adc3-9a9223d1c16a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:35:47 compute-0 nova_compute[187223]: 2025-11-28 17:35:47.053 187227 DEBUG oslo_concurrency.lockutils [req-0d587954-09c0-4997-ad9c-7ba2018ac570 req-3dc333cb-4349-4cda-b5ef-5f43bcd54a74 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "f333fabf-4a60-49fb-b6dc-d0cbeb847c8f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:35:47 compute-0 nova_compute[187223]: 2025-11-28 17:35:47.053 187227 DEBUG oslo_concurrency.lockutils [req-0d587954-09c0-4997-ad9c-7ba2018ac570 req-3dc333cb-4349-4cda-b5ef-5f43bcd54a74 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "f333fabf-4a60-49fb-b6dc-d0cbeb847c8f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:35:47 compute-0 nova_compute[187223]: 2025-11-28 17:35:47.054 187227 DEBUG oslo_concurrency.lockutils [req-0d587954-09c0-4997-ad9c-7ba2018ac570 req-3dc333cb-4349-4cda-b5ef-5f43bcd54a74 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "f333fabf-4a60-49fb-b6dc-d0cbeb847c8f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:35:47 compute-0 nova_compute[187223]: 2025-11-28 17:35:47.054 187227 DEBUG nova.compute.manager [req-0d587954-09c0-4997-ad9c-7ba2018ac570 req-3dc333cb-4349-4cda-b5ef-5f43bcd54a74 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] No waiting events found dispatching network-vif-plugged-b2417a6a-f805-4c80-adc3-9a9223d1c16a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:35:47 compute-0 nova_compute[187223]: 2025-11-28 17:35:47.054 187227 WARNING nova.compute.manager [req-0d587954-09c0-4997-ad9c-7ba2018ac570 req-3dc333cb-4349-4cda-b5ef-5f43bcd54a74 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] Received unexpected event network-vif-plugged-b2417a6a-f805-4c80-adc3-9a9223d1c16a for instance with vm_state active and task_state deleting.
Nov 28 17:35:47 compute-0 nova_compute[187223]: 2025-11-28 17:35:47.313 187227 DEBUG nova.network.neutron [-] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:35:47 compute-0 nova_compute[187223]: 2025-11-28 17:35:47.350 187227 INFO nova.compute.manager [-] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] Took 0.64 seconds to deallocate network for instance.
Nov 28 17:35:47 compute-0 nova_compute[187223]: 2025-11-28 17:35:47.402 187227 DEBUG oslo_concurrency.lockutils [None req-5974f676-62c9-437d-9966-0ac310c250ea a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:35:47 compute-0 nova_compute[187223]: 2025-11-28 17:35:47.403 187227 DEBUG oslo_concurrency.lockutils [None req-5974f676-62c9-437d-9966-0ac310c250ea a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:35:47 compute-0 nova_compute[187223]: 2025-11-28 17:35:47.456 187227 DEBUG nova.compute.provider_tree [None req-5974f676-62c9-437d-9966-0ac310c250ea a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 17:35:47 compute-0 nova_compute[187223]: 2025-11-28 17:35:47.473 187227 DEBUG nova.scheduler.client.report [None req-5974f676-62c9-437d-9966-0ac310c250ea a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 17:35:47 compute-0 nova_compute[187223]: 2025-11-28 17:35:47.501 187227 DEBUG oslo_concurrency.lockutils [None req-5974f676-62c9-437d-9966-0ac310c250ea a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.099s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:35:47 compute-0 nova_compute[187223]: 2025-11-28 17:35:47.529 187227 INFO nova.scheduler.client.report [None req-5974f676-62c9-437d-9966-0ac310c250ea a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Deleted allocations for instance f333fabf-4a60-49fb-b6dc-d0cbeb847c8f
Nov 28 17:35:47 compute-0 nova_compute[187223]: 2025-11-28 17:35:47.600 187227 DEBUG oslo_concurrency.lockutils [None req-5974f676-62c9-437d-9966-0ac310c250ea a2fe03a116c1411ebaa81bbd0334f5ed d63484fb636c435b8307abd484cb8aa7 - - default default] Lock "f333fabf-4a60-49fb-b6dc-d0cbeb847c8f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.265s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:35:48 compute-0 nova_compute[187223]: 2025-11-28 17:35:48.203 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:35:48 compute-0 podman[210841]: 2025-11-28 17:35:48.210795448 +0000 UTC m=+0.066584608 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 17:35:49 compute-0 nova_compute[187223]: 2025-11-28 17:35:49.157 187227 DEBUG nova.compute.manager [req-76cab964-bafc-47e8-838c-d5634d110352 req-ccd6891b-4b80-4640-9d09-22ac10fd6970 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] Received event network-vif-deleted-b2417a6a-f805-4c80-adc3-9a9223d1c16a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:35:51 compute-0 nova_compute[187223]: 2025-11-28 17:35:51.643 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:35:53 compute-0 nova_compute[187223]: 2025-11-28 17:35:53.207 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:35:55 compute-0 podman[210865]: 2025-11-28 17:35:55.190291437 +0000 UTC m=+0.056175746 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 28 17:35:56 compute-0 nova_compute[187223]: 2025-11-28 17:35:56.647 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:35:58 compute-0 nova_compute[187223]: 2025-11-28 17:35:58.209 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:35:58 compute-0 nova_compute[187223]: 2025-11-28 17:35:58.878 187227 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764351343.8769772, e81c4034-7fd8-453d-9ece-6ce03cb4aa70 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:35:58 compute-0 nova_compute[187223]: 2025-11-28 17:35:58.879 187227 INFO nova.compute.manager [-] [instance: e81c4034-7fd8-453d-9ece-6ce03cb4aa70] VM Stopped (Lifecycle Event)
Nov 28 17:35:58 compute-0 nova_compute[187223]: 2025-11-28 17:35:58.909 187227 DEBUG nova.compute.manager [None req-9ce86e9b-dc23-4355-8072-8af2b311a06b - - - - - -] [instance: e81c4034-7fd8-453d-9ece-6ce03cb4aa70] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:35:59 compute-0 podman[197556]: time="2025-11-28T17:35:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:35:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:35:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 28 17:35:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:35:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2585 "" "Go-http-client/1.1"
Nov 28 17:36:00 compute-0 podman[210885]: 2025-11-28 17:36:00.243953742 +0000 UTC m=+0.093908677 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 28 17:36:00 compute-0 podman[210886]: 2025-11-28 17:36:00.287549664 +0000 UTC m=+0.133072611 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, tcib_managed=true)
Nov 28 17:36:01 compute-0 openstack_network_exporter[199717]: ERROR   17:36:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:36:01 compute-0 openstack_network_exporter[199717]: ERROR   17:36:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:36:01 compute-0 openstack_network_exporter[199717]: ERROR   17:36:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:36:01 compute-0 openstack_network_exporter[199717]: ERROR   17:36:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:36:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:36:01 compute-0 openstack_network_exporter[199717]: ERROR   17:36:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:36:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:36:01 compute-0 nova_compute[187223]: 2025-11-28 17:36:01.616 187227 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764351346.614241, f333fabf-4a60-49fb-b6dc-d0cbeb847c8f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:36:01 compute-0 nova_compute[187223]: 2025-11-28 17:36:01.616 187227 INFO nova.compute.manager [-] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] VM Stopped (Lifecycle Event)
Nov 28 17:36:01 compute-0 nova_compute[187223]: 2025-11-28 17:36:01.639 187227 DEBUG nova.compute.manager [None req-089d48df-4b62-4c3f-8633-a2c5d9ff0e2f - - - - - -] [instance: f333fabf-4a60-49fb-b6dc-d0cbeb847c8f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:36:01 compute-0 nova_compute[187223]: 2025-11-28 17:36:01.648 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:36:03 compute-0 nova_compute[187223]: 2025-11-28 17:36:03.211 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:36:03 compute-0 podman[210928]: 2025-11-28 17:36:03.24156757 +0000 UTC m=+0.102671121 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, version=9.6, config_id=edpm, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.33.7)
Nov 28 17:36:06 compute-0 nova_compute[187223]: 2025-11-28 17:36:06.651 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:36:08 compute-0 nova_compute[187223]: 2025-11-28 17:36:08.255 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:36:09 compute-0 nova_compute[187223]: 2025-11-28 17:36:09.838 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:36:11 compute-0 nova_compute[187223]: 2025-11-28 17:36:11.655 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:36:13 compute-0 nova_compute[187223]: 2025-11-28 17:36:13.257 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:36:16 compute-0 nova_compute[187223]: 2025-11-28 17:36:16.698 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:36:18 compute-0 nova_compute[187223]: 2025-11-28 17:36:18.259 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:36:19 compute-0 podman[210950]: 2025-11-28 17:36:19.230069271 +0000 UTC m=+0.087942605 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 17:36:21 compute-0 nova_compute[187223]: 2025-11-28 17:36:21.703 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:36:23 compute-0 nova_compute[187223]: 2025-11-28 17:36:23.262 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:36:26 compute-0 podman[210974]: 2025-11-28 17:36:26.217661745 +0000 UTC m=+0.079438639 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 28 17:36:26 compute-0 nova_compute[187223]: 2025-11-28 17:36:26.707 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:36:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:36:27.682 104433 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:36:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:36:27.683 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:36:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:36:27.683 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:36:27 compute-0 nova_compute[187223]: 2025-11-28 17:36:27.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:36:28 compute-0 nova_compute[187223]: 2025-11-28 17:36:28.265 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:36:28 compute-0 nova_compute[187223]: 2025-11-28 17:36:28.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:36:28 compute-0 nova_compute[187223]: 2025-11-28 17:36:28.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:36:28 compute-0 nova_compute[187223]: 2025-11-28 17:36:28.684 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 17:36:29 compute-0 nova_compute[187223]: 2025-11-28 17:36:29.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:36:29 compute-0 nova_compute[187223]: 2025-11-28 17:36:29.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:36:29 compute-0 nova_compute[187223]: 2025-11-28 17:36:29.713 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:36:29 compute-0 nova_compute[187223]: 2025-11-28 17:36:29.714 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:36:29 compute-0 nova_compute[187223]: 2025-11-28 17:36:29.714 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:36:29 compute-0 nova_compute[187223]: 2025-11-28 17:36:29.714 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 17:36:29 compute-0 podman[197556]: time="2025-11-28T17:36:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:36:29 compute-0 podman[197556]: @ - - [28/Nov/2025:17:36:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 28 17:36:29 compute-0 podman[197556]: @ - - [28/Nov/2025:17:36:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2591 "" "Go-http-client/1.1"
Nov 28 17:36:29 compute-0 nova_compute[187223]: 2025-11-28 17:36:29.899 187227 WARNING nova.virt.libvirt.driver [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 17:36:29 compute-0 nova_compute[187223]: 2025-11-28 17:36:29.900 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5878MB free_disk=73.3452377319336GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 17:36:29 compute-0 nova_compute[187223]: 2025-11-28 17:36:29.900 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:36:29 compute-0 nova_compute[187223]: 2025-11-28 17:36:29.900 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:36:29 compute-0 nova_compute[187223]: 2025-11-28 17:36:29.956 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 17:36:29 compute-0 nova_compute[187223]: 2025-11-28 17:36:29.956 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 17:36:29 compute-0 nova_compute[187223]: 2025-11-28 17:36:29.975 187227 DEBUG nova.scheduler.client.report [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Refreshing inventories for resource provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 28 17:36:29 compute-0 nova_compute[187223]: 2025-11-28 17:36:29.990 187227 DEBUG nova.scheduler.client.report [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Updating ProviderTree inventory for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 28 17:36:29 compute-0 nova_compute[187223]: 2025-11-28 17:36:29.991 187227 DEBUG nova.compute.provider_tree [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Updating inventory in ProviderTree for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 28 17:36:30 compute-0 nova_compute[187223]: 2025-11-28 17:36:30.005 187227 DEBUG nova.scheduler.client.report [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Refreshing aggregate associations for resource provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 28 17:36:30 compute-0 nova_compute[187223]: 2025-11-28 17:36:30.034 187227 DEBUG nova.scheduler.client.report [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Refreshing trait associations for resource provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5, traits: COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_USB,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE41,HW_CPU_X86_SSE42,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_SSE,HW_CPU_X86_SSSE3,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 28 17:36:30 compute-0 nova_compute[187223]: 2025-11-28 17:36:30.064 187227 DEBUG nova.compute.provider_tree [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 17:36:30 compute-0 nova_compute[187223]: 2025-11-28 17:36:30.082 187227 DEBUG nova.scheduler.client.report [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 17:36:30 compute-0 nova_compute[187223]: 2025-11-28 17:36:30.107 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 17:36:30 compute-0 nova_compute[187223]: 2025-11-28 17:36:30.107 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.207s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:36:31 compute-0 podman[210997]: 2025-11-28 17:36:31.235207035 +0000 UTC m=+0.080118970 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 17:36:31 compute-0 podman[210998]: 2025-11-28 17:36:31.281890108 +0000 UTC m=+0.133706385 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller, managed_by=edpm_ansible)
Nov 28 17:36:31 compute-0 openstack_network_exporter[199717]: ERROR   17:36:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:36:31 compute-0 openstack_network_exporter[199717]: ERROR   17:36:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:36:31 compute-0 openstack_network_exporter[199717]: ERROR   17:36:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:36:31 compute-0 openstack_network_exporter[199717]: ERROR   17:36:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:36:31 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:36:31 compute-0 openstack_network_exporter[199717]: ERROR   17:36:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:36:31 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:36:31 compute-0 nova_compute[187223]: 2025-11-28 17:36:31.708 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:36:32 compute-0 nova_compute[187223]: 2025-11-28 17:36:32.103 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:36:32 compute-0 nova_compute[187223]: 2025-11-28 17:36:32.124 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:36:32 compute-0 nova_compute[187223]: 2025-11-28 17:36:32.125 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 17:36:32 compute-0 nova_compute[187223]: 2025-11-28 17:36:32.125 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 17:36:32 compute-0 nova_compute[187223]: 2025-11-28 17:36:32.138 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 17:36:32 compute-0 nova_compute[187223]: 2025-11-28 17:36:32.138 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:36:32 compute-0 nova_compute[187223]: 2025-11-28 17:36:32.138 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:36:33 compute-0 nova_compute[187223]: 2025-11-28 17:36:33.267 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:36:34 compute-0 podman[211043]: 2025-11-28 17:36:34.221266213 +0000 UTC m=+0.079979416 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, version=9.6, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, maintainer=Red Hat, Inc., config_id=edpm, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41)
Nov 28 17:36:34 compute-0 nova_compute[187223]: 2025-11-28 17:36:34.713 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:36:36 compute-0 nova_compute[187223]: 2025-11-28 17:36:36.713 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:36:38 compute-0 nova_compute[187223]: 2025-11-28 17:36:38.269 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:36:41 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:36:41.182 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3e:3c:af', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '0e:e0:60:f9:5f:25'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 17:36:41 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:36:41.184 104433 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 17:36:41 compute-0 nova_compute[187223]: 2025-11-28 17:36:41.183 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:36:41 compute-0 nova_compute[187223]: 2025-11-28 17:36:41.753 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:36:43 compute-0 nova_compute[187223]: 2025-11-28 17:36:43.272 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:36:44 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:36:44.187 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2ad2dbac-a967-40fb-b69b-7c374c5f8e9d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:36:46 compute-0 ovn_controller[95574]: 2025-11-28T17:36:46Z|00077|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Nov 28 17:36:46 compute-0 nova_compute[187223]: 2025-11-28 17:36:46.757 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:36:48 compute-0 nova_compute[187223]: 2025-11-28 17:36:48.274 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:36:50 compute-0 podman[211065]: 2025-11-28 17:36:50.220109463 +0000 UTC m=+0.075354412 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 17:36:51 compute-0 nova_compute[187223]: 2025-11-28 17:36:51.761 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:36:53 compute-0 nova_compute[187223]: 2025-11-28 17:36:53.276 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:36:56 compute-0 nova_compute[187223]: 2025-11-28 17:36:56.765 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:36:57 compute-0 podman[211089]: 2025-11-28 17:36:57.236605182 +0000 UTC m=+0.096722354 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 28 17:36:58 compute-0 nova_compute[187223]: 2025-11-28 17:36:58.323 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:36:59 compute-0 podman[197556]: time="2025-11-28T17:36:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:36:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:36:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 28 17:36:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:36:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2589 "" "Go-http-client/1.1"
Nov 28 17:37:01 compute-0 openstack_network_exporter[199717]: ERROR   17:37:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:37:01 compute-0 openstack_network_exporter[199717]: ERROR   17:37:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:37:01 compute-0 openstack_network_exporter[199717]: ERROR   17:37:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:37:01 compute-0 openstack_network_exporter[199717]: ERROR   17:37:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:37:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:37:01 compute-0 openstack_network_exporter[199717]: ERROR   17:37:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:37:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:37:01 compute-0 nova_compute[187223]: 2025-11-28 17:37:01.768 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:37:02 compute-0 podman[211108]: 2025-11-28 17:37:02.19766724 +0000 UTC m=+0.058737308 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 17:37:02 compute-0 podman[211109]: 2025-11-28 17:37:02.240819725 +0000 UTC m=+0.092220342 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 28 17:37:03 compute-0 nova_compute[187223]: 2025-11-28 17:37:03.325 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:37:05 compute-0 podman[211154]: 2025-11-28 17:37:05.229129304 +0000 UTC m=+0.091248443 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, version=9.6, release=1755695350, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible)
Nov 28 17:37:06 compute-0 nova_compute[187223]: 2025-11-28 17:37:06.771 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:37:08 compute-0 nova_compute[187223]: 2025-11-28 17:37:08.326 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:37:11 compute-0 nova_compute[187223]: 2025-11-28 17:37:11.774 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:37:13 compute-0 nova_compute[187223]: 2025-11-28 17:37:13.328 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:37:16 compute-0 nova_compute[187223]: 2025-11-28 17:37:16.777 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:37:18 compute-0 nova_compute[187223]: 2025-11-28 17:37:18.330 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:37:20 compute-0 nova_compute[187223]: 2025-11-28 17:37:20.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:37:20 compute-0 nova_compute[187223]: 2025-11-28 17:37:20.684 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 28 17:37:20 compute-0 nova_compute[187223]: 2025-11-28 17:37:20.699 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 28 17:37:21 compute-0 podman[211175]: 2025-11-28 17:37:21.202861551 +0000 UTC m=+0.062650583 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 17:37:21 compute-0 nova_compute[187223]: 2025-11-28 17:37:21.781 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:37:23 compute-0 nova_compute[187223]: 2025-11-28 17:37:23.331 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:37:26 compute-0 nova_compute[187223]: 2025-11-28 17:37:26.784 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:37:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:37:27.683 104433 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:37:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:37:27.684 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:37:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:37:27.685 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:37:27 compute-0 nova_compute[187223]: 2025-11-28 17:37:27.700 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:37:28 compute-0 podman[211199]: 2025-11-28 17:37:28.205655733 +0000 UTC m=+0.061047856 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 17:37:28 compute-0 nova_compute[187223]: 2025-11-28 17:37:28.334 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:37:28 compute-0 nova_compute[187223]: 2025-11-28 17:37:28.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:37:28 compute-0 nova_compute[187223]: 2025-11-28 17:37:28.684 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 17:37:28 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Nov 28 17:37:29 compute-0 nova_compute[187223]: 2025-11-28 17:37:29.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:37:29 compute-0 nova_compute[187223]: 2025-11-28 17:37:29.711 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:37:29 compute-0 nova_compute[187223]: 2025-11-28 17:37:29.712 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:37:29 compute-0 nova_compute[187223]: 2025-11-28 17:37:29.712 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:37:29 compute-0 nova_compute[187223]: 2025-11-28 17:37:29.712 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 17:37:29 compute-0 podman[197556]: time="2025-11-28T17:37:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:37:29 compute-0 podman[197556]: @ - - [28/Nov/2025:17:37:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 28 17:37:29 compute-0 podman[197556]: @ - - [28/Nov/2025:17:37:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2588 "" "Go-http-client/1.1"
Nov 28 17:37:29 compute-0 nova_compute[187223]: 2025-11-28 17:37:29.867 187227 WARNING nova.virt.libvirt.driver [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 17:37:29 compute-0 nova_compute[187223]: 2025-11-28 17:37:29.868 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5884MB free_disk=73.34525680541992GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 17:37:29 compute-0 nova_compute[187223]: 2025-11-28 17:37:29.868 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:37:29 compute-0 nova_compute[187223]: 2025-11-28 17:37:29.869 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:37:30 compute-0 nova_compute[187223]: 2025-11-28 17:37:30.141 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 17:37:30 compute-0 nova_compute[187223]: 2025-11-28 17:37:30.141 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 17:37:30 compute-0 nova_compute[187223]: 2025-11-28 17:37:30.221 187227 DEBUG nova.compute.provider_tree [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 17:37:30 compute-0 nova_compute[187223]: 2025-11-28 17:37:30.251 187227 DEBUG nova.scheduler.client.report [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 17:37:30 compute-0 nova_compute[187223]: 2025-11-28 17:37:30.253 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 17:37:30 compute-0 nova_compute[187223]: 2025-11-28 17:37:30.253 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.384s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:37:31 compute-0 nova_compute[187223]: 2025-11-28 17:37:31.255 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:37:31 compute-0 nova_compute[187223]: 2025-11-28 17:37:31.256 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:37:31 compute-0 openstack_network_exporter[199717]: ERROR   17:37:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:37:31 compute-0 openstack_network_exporter[199717]: ERROR   17:37:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:37:31 compute-0 openstack_network_exporter[199717]: ERROR   17:37:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:37:31 compute-0 openstack_network_exporter[199717]: ERROR   17:37:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:37:31 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:37:31 compute-0 openstack_network_exporter[199717]: ERROR   17:37:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:37:31 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:37:31 compute-0 nova_compute[187223]: 2025-11-28 17:37:31.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:37:31 compute-0 nova_compute[187223]: 2025-11-28 17:37:31.684 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 17:37:31 compute-0 nova_compute[187223]: 2025-11-28 17:37:31.684 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 17:37:31 compute-0 nova_compute[187223]: 2025-11-28 17:37:31.708 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 17:37:31 compute-0 nova_compute[187223]: 2025-11-28 17:37:31.710 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:37:31 compute-0 nova_compute[187223]: 2025-11-28 17:37:31.711 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:37:31 compute-0 nova_compute[187223]: 2025-11-28 17:37:31.711 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 28 17:37:31 compute-0 nova_compute[187223]: 2025-11-28 17:37:31.786 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:37:32 compute-0 nova_compute[187223]: 2025-11-28 17:37:32.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:37:32 compute-0 nova_compute[187223]: 2025-11-28 17:37:32.685 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:37:33 compute-0 podman[211220]: 2025-11-28 17:37:33.206828755 +0000 UTC m=+0.068399689 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, managed_by=edpm_ansible)
Nov 28 17:37:33 compute-0 podman[211221]: 2025-11-28 17:37:33.271208817 +0000 UTC m=+0.116439726 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 28 17:37:33 compute-0 nova_compute[187223]: 2025-11-28 17:37:33.336 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:37:36 compute-0 podman[211269]: 2025-11-28 17:37:36.203128987 +0000 UTC m=+0.064630430 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, architecture=x86_64, container_name=openstack_network_exporter, distribution-scope=public, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_id=edpm, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, version=9.6, maintainer=Red Hat, Inc.)
Nov 28 17:37:36 compute-0 nova_compute[187223]: 2025-11-28 17:37:36.699 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:37:36 compute-0 nova_compute[187223]: 2025-11-28 17:37:36.790 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:37:38 compute-0 nova_compute[187223]: 2025-11-28 17:37:38.338 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:37:41 compute-0 nova_compute[187223]: 2025-11-28 17:37:41.793 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:37:43 compute-0 nova_compute[187223]: 2025-11-28 17:37:43.338 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:37:43 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:37:43.870 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3e:3c:af', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '0e:e0:60:f9:5f:25'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 17:37:43 compute-0 nova_compute[187223]: 2025-11-28 17:37:43.870 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:37:43 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:37:43.871 104433 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 17:37:46 compute-0 nova_compute[187223]: 2025-11-28 17:37:46.797 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:37:48 compute-0 nova_compute[187223]: 2025-11-28 17:37:48.339 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:37:51 compute-0 nova_compute[187223]: 2025-11-28 17:37:51.801 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:37:52 compute-0 podman[211291]: 2025-11-28 17:37:52.186234496 +0000 UTC m=+0.052659142 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 17:37:52 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:37:52.875 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2ad2dbac-a967-40fb-b69b-7c374c5f8e9d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:37:53 compute-0 nova_compute[187223]: 2025-11-28 17:37:53.342 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:37:56 compute-0 nova_compute[187223]: 2025-11-28 17:37:56.805 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:37:58 compute-0 nova_compute[187223]: 2025-11-28 17:37:58.344 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:37:59 compute-0 podman[211316]: 2025-11-28 17:37:59.20221742 +0000 UTC m=+0.059723947 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 17:37:59 compute-0 podman[197556]: time="2025-11-28T17:37:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:37:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:37:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 28 17:37:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:37:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2587 "" "Go-http-client/1.1"
Nov 28 17:38:01 compute-0 openstack_network_exporter[199717]: ERROR   17:38:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:38:01 compute-0 openstack_network_exporter[199717]: ERROR   17:38:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:38:01 compute-0 openstack_network_exporter[199717]: ERROR   17:38:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:38:01 compute-0 openstack_network_exporter[199717]: ERROR   17:38:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:38:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:38:01 compute-0 openstack_network_exporter[199717]: ERROR   17:38:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:38:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:38:01 compute-0 nova_compute[187223]: 2025-11-28 17:38:01.834 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:38:03 compute-0 nova_compute[187223]: 2025-11-28 17:38:03.347 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:38:04 compute-0 podman[211336]: 2025-11-28 17:38:04.253176581 +0000 UTC m=+0.103700486 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 17:38:04 compute-0 podman[211335]: 2025-11-28 17:38:04.261936896 +0000 UTC m=+0.109288259 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 28 17:38:06 compute-0 nova_compute[187223]: 2025-11-28 17:38:06.838 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:38:07 compute-0 podman[211382]: 2025-11-28 17:38:07.321076895 +0000 UTC m=+0.066311728 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, vcs-type=git, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., version=9.6, container_name=openstack_network_exporter, distribution-scope=public, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 17:38:08 compute-0 nova_compute[187223]: 2025-11-28 17:38:08.350 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:38:09 compute-0 nova_compute[187223]: 2025-11-28 17:38:09.136 187227 DEBUG oslo_concurrency.lockutils [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Acquiring lock "46d07989-a2d7-4ab0-a623-3bf99e2b2b81" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:38:09 compute-0 nova_compute[187223]: 2025-11-28 17:38:09.137 187227 DEBUG oslo_concurrency.lockutils [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Lock "46d07989-a2d7-4ab0-a623-3bf99e2b2b81" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:38:09 compute-0 nova_compute[187223]: 2025-11-28 17:38:09.160 187227 DEBUG nova.compute.manager [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 28 17:38:09 compute-0 nova_compute[187223]: 2025-11-28 17:38:09.260 187227 DEBUG oslo_concurrency.lockutils [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:38:09 compute-0 nova_compute[187223]: 2025-11-28 17:38:09.261 187227 DEBUG oslo_concurrency.lockutils [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:38:09 compute-0 nova_compute[187223]: 2025-11-28 17:38:09.270 187227 DEBUG nova.virt.hardware [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 28 17:38:09 compute-0 nova_compute[187223]: 2025-11-28 17:38:09.271 187227 INFO nova.compute.claims [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Claim successful on node compute-0.ctlplane.example.com
Nov 28 17:38:09 compute-0 nova_compute[187223]: 2025-11-28 17:38:09.412 187227 DEBUG nova.compute.provider_tree [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 17:38:09 compute-0 nova_compute[187223]: 2025-11-28 17:38:09.426 187227 DEBUG nova.scheduler.client.report [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 17:38:09 compute-0 nova_compute[187223]: 2025-11-28 17:38:09.453 187227 DEBUG oslo_concurrency.lockutils [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.192s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:38:09 compute-0 nova_compute[187223]: 2025-11-28 17:38:09.454 187227 DEBUG nova.compute.manager [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 28 17:38:09 compute-0 nova_compute[187223]: 2025-11-28 17:38:09.499 187227 DEBUG nova.compute.manager [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 28 17:38:09 compute-0 nova_compute[187223]: 2025-11-28 17:38:09.500 187227 DEBUG nova.network.neutron [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 28 17:38:09 compute-0 nova_compute[187223]: 2025-11-28 17:38:09.534 187227 INFO nova.virt.libvirt.driver [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 28 17:38:09 compute-0 nova_compute[187223]: 2025-11-28 17:38:09.561 187227 DEBUG nova.compute.manager [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 28 17:38:09 compute-0 nova_compute[187223]: 2025-11-28 17:38:09.657 187227 DEBUG nova.compute.manager [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 28 17:38:09 compute-0 nova_compute[187223]: 2025-11-28 17:38:09.659 187227 DEBUG nova.virt.libvirt.driver [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 28 17:38:09 compute-0 nova_compute[187223]: 2025-11-28 17:38:09.660 187227 INFO nova.virt.libvirt.driver [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Creating image(s)
Nov 28 17:38:09 compute-0 nova_compute[187223]: 2025-11-28 17:38:09.661 187227 DEBUG oslo_concurrency.lockutils [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Acquiring lock "/var/lib/nova/instances/46d07989-a2d7-4ab0-a623-3bf99e2b2b81/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:38:09 compute-0 nova_compute[187223]: 2025-11-28 17:38:09.661 187227 DEBUG oslo_concurrency.lockutils [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Lock "/var/lib/nova/instances/46d07989-a2d7-4ab0-a623-3bf99e2b2b81/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:38:09 compute-0 nova_compute[187223]: 2025-11-28 17:38:09.662 187227 DEBUG oslo_concurrency.lockutils [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Lock "/var/lib/nova/instances/46d07989-a2d7-4ab0-a623-3bf99e2b2b81/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:38:09 compute-0 nova_compute[187223]: 2025-11-28 17:38:09.678 187227 DEBUG oslo_concurrency.processutils [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:38:09 compute-0 nova_compute[187223]: 2025-11-28 17:38:09.741 187227 DEBUG oslo_concurrency.processutils [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:38:09 compute-0 nova_compute[187223]: 2025-11-28 17:38:09.742 187227 DEBUG oslo_concurrency.lockutils [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Acquiring lock "bd6565e0cc32420aac21dd3de8842bc53bb13eba" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:38:09 compute-0 nova_compute[187223]: 2025-11-28 17:38:09.742 187227 DEBUG oslo_concurrency.lockutils [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Lock "bd6565e0cc32420aac21dd3de8842bc53bb13eba" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:38:09 compute-0 nova_compute[187223]: 2025-11-28 17:38:09.753 187227 DEBUG oslo_concurrency.processutils [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:38:09 compute-0 nova_compute[187223]: 2025-11-28 17:38:09.811 187227 DEBUG oslo_concurrency.processutils [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:38:09 compute-0 nova_compute[187223]: 2025-11-28 17:38:09.812 187227 DEBUG oslo_concurrency.processutils [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba,backing_fmt=raw /var/lib/nova/instances/46d07989-a2d7-4ab0-a623-3bf99e2b2b81/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:38:09 compute-0 nova_compute[187223]: 2025-11-28 17:38:09.831 187227 DEBUG nova.policy [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e6ed8cc17a7c4f34b32582c250e4b754', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f8bcfa33baac4402a3841f550dae7748', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 28 17:38:09 compute-0 nova_compute[187223]: 2025-11-28 17:38:09.846 187227 DEBUG oslo_concurrency.processutils [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba,backing_fmt=raw /var/lib/nova/instances/46d07989-a2d7-4ab0-a623-3bf99e2b2b81/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:38:09 compute-0 nova_compute[187223]: 2025-11-28 17:38:09.847 187227 DEBUG oslo_concurrency.lockutils [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Lock "bd6565e0cc32420aac21dd3de8842bc53bb13eba" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:38:09 compute-0 nova_compute[187223]: 2025-11-28 17:38:09.848 187227 DEBUG oslo_concurrency.processutils [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:38:09 compute-0 nova_compute[187223]: 2025-11-28 17:38:09.903 187227 DEBUG oslo_concurrency.processutils [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:38:09 compute-0 nova_compute[187223]: 2025-11-28 17:38:09.904 187227 DEBUG nova.virt.disk.api [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Checking if we can resize image /var/lib/nova/instances/46d07989-a2d7-4ab0-a623-3bf99e2b2b81/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 28 17:38:09 compute-0 nova_compute[187223]: 2025-11-28 17:38:09.904 187227 DEBUG oslo_concurrency.processutils [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/46d07989-a2d7-4ab0-a623-3bf99e2b2b81/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:38:09 compute-0 nova_compute[187223]: 2025-11-28 17:38:09.960 187227 DEBUG oslo_concurrency.processutils [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/46d07989-a2d7-4ab0-a623-3bf99e2b2b81/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:38:09 compute-0 nova_compute[187223]: 2025-11-28 17:38:09.961 187227 DEBUG nova.virt.disk.api [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Cannot resize image /var/lib/nova/instances/46d07989-a2d7-4ab0-a623-3bf99e2b2b81/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 28 17:38:09 compute-0 nova_compute[187223]: 2025-11-28 17:38:09.962 187227 DEBUG nova.objects.instance [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Lazy-loading 'migration_context' on Instance uuid 46d07989-a2d7-4ab0-a623-3bf99e2b2b81 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:38:10 compute-0 nova_compute[187223]: 2025-11-28 17:38:10.226 187227 DEBUG nova.virt.libvirt.driver [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 28 17:38:10 compute-0 nova_compute[187223]: 2025-11-28 17:38:10.226 187227 DEBUG nova.virt.libvirt.driver [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Ensure instance console log exists: /var/lib/nova/instances/46d07989-a2d7-4ab0-a623-3bf99e2b2b81/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 28 17:38:10 compute-0 nova_compute[187223]: 2025-11-28 17:38:10.227 187227 DEBUG oslo_concurrency.lockutils [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:38:10 compute-0 nova_compute[187223]: 2025-11-28 17:38:10.227 187227 DEBUG oslo_concurrency.lockutils [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:38:10 compute-0 nova_compute[187223]: 2025-11-28 17:38:10.228 187227 DEBUG oslo_concurrency.lockutils [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:38:10 compute-0 nova_compute[187223]: 2025-11-28 17:38:10.967 187227 DEBUG nova.network.neutron [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Successfully created port: 88207338-3bf0-499d-860e-d43ba0c80385 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 28 17:38:11 compute-0 nova_compute[187223]: 2025-11-28 17:38:11.841 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:38:13 compute-0 nova_compute[187223]: 2025-11-28 17:38:13.393 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:38:13 compute-0 nova_compute[187223]: 2025-11-28 17:38:13.799 187227 DEBUG nova.network.neutron [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Successfully updated port: 88207338-3bf0-499d-860e-d43ba0c80385 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 28 17:38:13 compute-0 nova_compute[187223]: 2025-11-28 17:38:13.828 187227 DEBUG oslo_concurrency.lockutils [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Acquiring lock "refresh_cache-46d07989-a2d7-4ab0-a623-3bf99e2b2b81" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 17:38:13 compute-0 nova_compute[187223]: 2025-11-28 17:38:13.828 187227 DEBUG oslo_concurrency.lockutils [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Acquired lock "refresh_cache-46d07989-a2d7-4ab0-a623-3bf99e2b2b81" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 17:38:13 compute-0 nova_compute[187223]: 2025-11-28 17:38:13.828 187227 DEBUG nova.network.neutron [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 28 17:38:13 compute-0 nova_compute[187223]: 2025-11-28 17:38:13.923 187227 DEBUG nova.compute.manager [req-96e255bb-3658-4713-a5d9-1b99eb45d508 req-6019d429-4338-4bef-b2f6-a2ebdb8e6c8d 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Received event network-changed-88207338-3bf0-499d-860e-d43ba0c80385 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:38:13 compute-0 nova_compute[187223]: 2025-11-28 17:38:13.923 187227 DEBUG nova.compute.manager [req-96e255bb-3658-4713-a5d9-1b99eb45d508 req-6019d429-4338-4bef-b2f6-a2ebdb8e6c8d 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Refreshing instance network info cache due to event network-changed-88207338-3bf0-499d-860e-d43ba0c80385. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 28 17:38:13 compute-0 nova_compute[187223]: 2025-11-28 17:38:13.923 187227 DEBUG oslo_concurrency.lockutils [req-96e255bb-3658-4713-a5d9-1b99eb45d508 req-6019d429-4338-4bef-b2f6-a2ebdb8e6c8d 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "refresh_cache-46d07989-a2d7-4ab0-a623-3bf99e2b2b81" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 17:38:14 compute-0 nova_compute[187223]: 2025-11-28 17:38:14.806 187227 DEBUG nova.network.neutron [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 28 17:38:16 compute-0 nova_compute[187223]: 2025-11-28 17:38:16.845 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:38:17 compute-0 nova_compute[187223]: 2025-11-28 17:38:17.108 187227 DEBUG nova.network.neutron [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Updating instance_info_cache with network_info: [{"id": "88207338-3bf0-499d-860e-d43ba0c80385", "address": "fa:16:3e:ab:9f:57", "network": {"id": "9c78e191-f6d6-4fbb-a215-3abc59437ec7", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-222589930-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f8bcfa33baac4402a3841f550dae7748", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88207338-3b", "ovs_interfaceid": "88207338-3bf0-499d-860e-d43ba0c80385", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:38:17 compute-0 nova_compute[187223]: 2025-11-28 17:38:17.135 187227 DEBUG oslo_concurrency.lockutils [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Releasing lock "refresh_cache-46d07989-a2d7-4ab0-a623-3bf99e2b2b81" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 17:38:17 compute-0 nova_compute[187223]: 2025-11-28 17:38:17.135 187227 DEBUG nova.compute.manager [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Instance network_info: |[{"id": "88207338-3bf0-499d-860e-d43ba0c80385", "address": "fa:16:3e:ab:9f:57", "network": {"id": "9c78e191-f6d6-4fbb-a215-3abc59437ec7", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-222589930-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f8bcfa33baac4402a3841f550dae7748", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88207338-3b", "ovs_interfaceid": "88207338-3bf0-499d-860e-d43ba0c80385", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 28 17:38:17 compute-0 nova_compute[187223]: 2025-11-28 17:38:17.136 187227 DEBUG oslo_concurrency.lockutils [req-96e255bb-3658-4713-a5d9-1b99eb45d508 req-6019d429-4338-4bef-b2f6-a2ebdb8e6c8d 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquired lock "refresh_cache-46d07989-a2d7-4ab0-a623-3bf99e2b2b81" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 17:38:17 compute-0 nova_compute[187223]: 2025-11-28 17:38:17.136 187227 DEBUG nova.network.neutron [req-96e255bb-3658-4713-a5d9-1b99eb45d508 req-6019d429-4338-4bef-b2f6-a2ebdb8e6c8d 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Refreshing network info cache for port 88207338-3bf0-499d-860e-d43ba0c80385 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 28 17:38:17 compute-0 nova_compute[187223]: 2025-11-28 17:38:17.139 187227 DEBUG nova.virt.libvirt.driver [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Start _get_guest_xml network_info=[{"id": "88207338-3bf0-499d-860e-d43ba0c80385", "address": "fa:16:3e:ab:9f:57", "network": {"id": "9c78e191-f6d6-4fbb-a215-3abc59437ec7", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-222589930-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f8bcfa33baac4402a3841f550dae7748", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88207338-3b", "ovs_interfaceid": "88207338-3bf0-499d-860e-d43ba0c80385", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T17:27:07Z,direct_url=<?>,disk_format='qcow2',id=e66bcfff-a835-4b6a-9892-490d158c356a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ca3b936304c947f98c618f4bf25dee0e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-28T17:27:08Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_format': None, 'device_name': '/dev/vda', 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e66bcfff-a835-4b6a-9892-490d158c356a'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 28 17:38:17 compute-0 nova_compute[187223]: 2025-11-28 17:38:17.144 187227 WARNING nova.virt.libvirt.driver [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 17:38:17 compute-0 nova_compute[187223]: 2025-11-28 17:38:17.149 187227 DEBUG nova.virt.libvirt.host [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 28 17:38:17 compute-0 nova_compute[187223]: 2025-11-28 17:38:17.150 187227 DEBUG nova.virt.libvirt.host [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 28 17:38:17 compute-0 nova_compute[187223]: 2025-11-28 17:38:17.153 187227 DEBUG nova.virt.libvirt.host [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 28 17:38:17 compute-0 nova_compute[187223]: 2025-11-28 17:38:17.154 187227 DEBUG nova.virt.libvirt.host [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 28 17:38:17 compute-0 nova_compute[187223]: 2025-11-28 17:38:17.155 187227 DEBUG nova.virt.libvirt.driver [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 28 17:38:17 compute-0 nova_compute[187223]: 2025-11-28 17:38:17.155 187227 DEBUG nova.virt.hardware [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-28T17:27:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6f44bded-bdbe-4623-9c87-afc5919e8381',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T17:27:07Z,direct_url=<?>,disk_format='qcow2',id=e66bcfff-a835-4b6a-9892-490d158c356a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ca3b936304c947f98c618f4bf25dee0e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-28T17:27:08Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 28 17:38:17 compute-0 nova_compute[187223]: 2025-11-28 17:38:17.156 187227 DEBUG nova.virt.hardware [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 28 17:38:17 compute-0 nova_compute[187223]: 2025-11-28 17:38:17.156 187227 DEBUG nova.virt.hardware [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 28 17:38:17 compute-0 nova_compute[187223]: 2025-11-28 17:38:17.156 187227 DEBUG nova.virt.hardware [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 28 17:38:17 compute-0 nova_compute[187223]: 2025-11-28 17:38:17.156 187227 DEBUG nova.virt.hardware [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 28 17:38:17 compute-0 nova_compute[187223]: 2025-11-28 17:38:17.156 187227 DEBUG nova.virt.hardware [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 28 17:38:17 compute-0 nova_compute[187223]: 2025-11-28 17:38:17.156 187227 DEBUG nova.virt.hardware [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 28 17:38:17 compute-0 nova_compute[187223]: 2025-11-28 17:38:17.157 187227 DEBUG nova.virt.hardware [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 28 17:38:17 compute-0 nova_compute[187223]: 2025-11-28 17:38:17.157 187227 DEBUG nova.virt.hardware [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 28 17:38:17 compute-0 nova_compute[187223]: 2025-11-28 17:38:17.157 187227 DEBUG nova.virt.hardware [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 28 17:38:17 compute-0 nova_compute[187223]: 2025-11-28 17:38:17.157 187227 DEBUG nova.virt.hardware [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 28 17:38:17 compute-0 nova_compute[187223]: 2025-11-28 17:38:17.161 187227 DEBUG nova.virt.libvirt.vif [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T17:38:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1906783340',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1906783340',id=10,image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f8bcfa33baac4402a3841f550dae7748',ramdisk_id='',reservation_id='r-ykq8pcdm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1463017589',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1463017589-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T17:38:09Z,user_data=None,user_id='e6ed8cc17a7c4f34b32582c250e4b754',uuid=46d07989-a2d7-4ab0-a623-3bf99e2b2b81,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "88207338-3bf0-499d-860e-d43ba0c80385", "address": "fa:16:3e:ab:9f:57", "network": {"id": "9c78e191-f6d6-4fbb-a215-3abc59437ec7", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-222589930-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f8bcfa33baac4402a3841f550dae7748", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88207338-3b", "ovs_interfaceid": "88207338-3bf0-499d-860e-d43ba0c80385", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 28 17:38:17 compute-0 nova_compute[187223]: 2025-11-28 17:38:17.161 187227 DEBUG nova.network.os_vif_util [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Converting VIF {"id": "88207338-3bf0-499d-860e-d43ba0c80385", "address": "fa:16:3e:ab:9f:57", "network": {"id": "9c78e191-f6d6-4fbb-a215-3abc59437ec7", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-222589930-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f8bcfa33baac4402a3841f550dae7748", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88207338-3b", "ovs_interfaceid": "88207338-3bf0-499d-860e-d43ba0c80385", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 17:38:17 compute-0 nova_compute[187223]: 2025-11-28 17:38:17.162 187227 DEBUG nova.network.os_vif_util [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ab:9f:57,bridge_name='br-int',has_traffic_filtering=True,id=88207338-3bf0-499d-860e-d43ba0c80385,network=Network(9c78e191-f6d6-4fbb-a215-3abc59437ec7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap88207338-3b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 17:38:17 compute-0 nova_compute[187223]: 2025-11-28 17:38:17.163 187227 DEBUG nova.objects.instance [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Lazy-loading 'pci_devices' on Instance uuid 46d07989-a2d7-4ab0-a623-3bf99e2b2b81 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:38:17 compute-0 nova_compute[187223]: 2025-11-28 17:38:17.178 187227 DEBUG nova.virt.libvirt.driver [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] End _get_guest_xml xml=<domain type="kvm">
Nov 28 17:38:17 compute-0 nova_compute[187223]:   <uuid>46d07989-a2d7-4ab0-a623-3bf99e2b2b81</uuid>
Nov 28 17:38:17 compute-0 nova_compute[187223]:   <name>instance-0000000a</name>
Nov 28 17:38:17 compute-0 nova_compute[187223]:   <memory>131072</memory>
Nov 28 17:38:17 compute-0 nova_compute[187223]:   <vcpu>1</vcpu>
Nov 28 17:38:17 compute-0 nova_compute[187223]:   <metadata>
Nov 28 17:38:17 compute-0 nova_compute[187223]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 28 17:38:17 compute-0 nova_compute[187223]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 28 17:38:17 compute-0 nova_compute[187223]:       <nova:name>tempest-TestExecuteHostMaintenanceStrategy-server-1906783340</nova:name>
Nov 28 17:38:17 compute-0 nova_compute[187223]:       <nova:creationTime>2025-11-28 17:38:17</nova:creationTime>
Nov 28 17:38:17 compute-0 nova_compute[187223]:       <nova:flavor name="m1.nano">
Nov 28 17:38:17 compute-0 nova_compute[187223]:         <nova:memory>128</nova:memory>
Nov 28 17:38:17 compute-0 nova_compute[187223]:         <nova:disk>1</nova:disk>
Nov 28 17:38:17 compute-0 nova_compute[187223]:         <nova:swap>0</nova:swap>
Nov 28 17:38:17 compute-0 nova_compute[187223]:         <nova:ephemeral>0</nova:ephemeral>
Nov 28 17:38:17 compute-0 nova_compute[187223]:         <nova:vcpus>1</nova:vcpus>
Nov 28 17:38:17 compute-0 nova_compute[187223]:       </nova:flavor>
Nov 28 17:38:17 compute-0 nova_compute[187223]:       <nova:owner>
Nov 28 17:38:17 compute-0 nova_compute[187223]:         <nova:user uuid="e6ed8cc17a7c4f34b32582c250e4b754">tempest-TestExecuteHostMaintenanceStrategy-1463017589-project-member</nova:user>
Nov 28 17:38:17 compute-0 nova_compute[187223]:         <nova:project uuid="f8bcfa33baac4402a3841f550dae7748">tempest-TestExecuteHostMaintenanceStrategy-1463017589</nova:project>
Nov 28 17:38:17 compute-0 nova_compute[187223]:       </nova:owner>
Nov 28 17:38:17 compute-0 nova_compute[187223]:       <nova:root type="image" uuid="e66bcfff-a835-4b6a-9892-490d158c356a"/>
Nov 28 17:38:17 compute-0 nova_compute[187223]:       <nova:ports>
Nov 28 17:38:17 compute-0 nova_compute[187223]:         <nova:port uuid="88207338-3bf0-499d-860e-d43ba0c80385">
Nov 28 17:38:17 compute-0 nova_compute[187223]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 28 17:38:17 compute-0 nova_compute[187223]:         </nova:port>
Nov 28 17:38:17 compute-0 nova_compute[187223]:       </nova:ports>
Nov 28 17:38:17 compute-0 nova_compute[187223]:     </nova:instance>
Nov 28 17:38:17 compute-0 nova_compute[187223]:   </metadata>
Nov 28 17:38:17 compute-0 nova_compute[187223]:   <sysinfo type="smbios">
Nov 28 17:38:17 compute-0 nova_compute[187223]:     <system>
Nov 28 17:38:17 compute-0 nova_compute[187223]:       <entry name="manufacturer">RDO</entry>
Nov 28 17:38:17 compute-0 nova_compute[187223]:       <entry name="product">OpenStack Compute</entry>
Nov 28 17:38:17 compute-0 nova_compute[187223]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 28 17:38:17 compute-0 nova_compute[187223]:       <entry name="serial">46d07989-a2d7-4ab0-a623-3bf99e2b2b81</entry>
Nov 28 17:38:17 compute-0 nova_compute[187223]:       <entry name="uuid">46d07989-a2d7-4ab0-a623-3bf99e2b2b81</entry>
Nov 28 17:38:17 compute-0 nova_compute[187223]:       <entry name="family">Virtual Machine</entry>
Nov 28 17:38:17 compute-0 nova_compute[187223]:     </system>
Nov 28 17:38:17 compute-0 nova_compute[187223]:   </sysinfo>
Nov 28 17:38:17 compute-0 nova_compute[187223]:   <os>
Nov 28 17:38:17 compute-0 nova_compute[187223]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 28 17:38:17 compute-0 nova_compute[187223]:     <boot dev="hd"/>
Nov 28 17:38:17 compute-0 nova_compute[187223]:     <smbios mode="sysinfo"/>
Nov 28 17:38:17 compute-0 nova_compute[187223]:   </os>
Nov 28 17:38:17 compute-0 nova_compute[187223]:   <features>
Nov 28 17:38:17 compute-0 nova_compute[187223]:     <acpi/>
Nov 28 17:38:17 compute-0 nova_compute[187223]:     <apic/>
Nov 28 17:38:17 compute-0 nova_compute[187223]:     <vmcoreinfo/>
Nov 28 17:38:17 compute-0 nova_compute[187223]:   </features>
Nov 28 17:38:17 compute-0 nova_compute[187223]:   <clock offset="utc">
Nov 28 17:38:17 compute-0 nova_compute[187223]:     <timer name="pit" tickpolicy="delay"/>
Nov 28 17:38:17 compute-0 nova_compute[187223]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 28 17:38:17 compute-0 nova_compute[187223]:     <timer name="hpet" present="no"/>
Nov 28 17:38:17 compute-0 nova_compute[187223]:   </clock>
Nov 28 17:38:17 compute-0 nova_compute[187223]:   <cpu mode="custom" match="exact">
Nov 28 17:38:17 compute-0 nova_compute[187223]:     <model>Nehalem</model>
Nov 28 17:38:17 compute-0 nova_compute[187223]:     <topology sockets="1" cores="1" threads="1"/>
Nov 28 17:38:17 compute-0 nova_compute[187223]:   </cpu>
Nov 28 17:38:17 compute-0 nova_compute[187223]:   <devices>
Nov 28 17:38:17 compute-0 nova_compute[187223]:     <disk type="file" device="disk">
Nov 28 17:38:17 compute-0 nova_compute[187223]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 28 17:38:17 compute-0 nova_compute[187223]:       <source file="/var/lib/nova/instances/46d07989-a2d7-4ab0-a623-3bf99e2b2b81/disk"/>
Nov 28 17:38:17 compute-0 nova_compute[187223]:       <target dev="vda" bus="virtio"/>
Nov 28 17:38:17 compute-0 nova_compute[187223]:     </disk>
Nov 28 17:38:17 compute-0 nova_compute[187223]:     <disk type="file" device="cdrom">
Nov 28 17:38:17 compute-0 nova_compute[187223]:       <driver name="qemu" type="raw" cache="none"/>
Nov 28 17:38:17 compute-0 nova_compute[187223]:       <source file="/var/lib/nova/instances/46d07989-a2d7-4ab0-a623-3bf99e2b2b81/disk.config"/>
Nov 28 17:38:17 compute-0 nova_compute[187223]:       <target dev="sda" bus="sata"/>
Nov 28 17:38:17 compute-0 nova_compute[187223]:     </disk>
Nov 28 17:38:17 compute-0 nova_compute[187223]:     <interface type="ethernet">
Nov 28 17:38:17 compute-0 nova_compute[187223]:       <mac address="fa:16:3e:ab:9f:57"/>
Nov 28 17:38:17 compute-0 nova_compute[187223]:       <model type="virtio"/>
Nov 28 17:38:17 compute-0 nova_compute[187223]:       <driver name="vhost" rx_queue_size="512"/>
Nov 28 17:38:17 compute-0 nova_compute[187223]:       <mtu size="1442"/>
Nov 28 17:38:17 compute-0 nova_compute[187223]:       <target dev="tap88207338-3b"/>
Nov 28 17:38:17 compute-0 nova_compute[187223]:     </interface>
Nov 28 17:38:17 compute-0 nova_compute[187223]:     <serial type="pty">
Nov 28 17:38:17 compute-0 nova_compute[187223]:       <log file="/var/lib/nova/instances/46d07989-a2d7-4ab0-a623-3bf99e2b2b81/console.log" append="off"/>
Nov 28 17:38:17 compute-0 nova_compute[187223]:     </serial>
Nov 28 17:38:17 compute-0 nova_compute[187223]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 28 17:38:17 compute-0 nova_compute[187223]:     <video>
Nov 28 17:38:17 compute-0 nova_compute[187223]:       <model type="virtio"/>
Nov 28 17:38:17 compute-0 nova_compute[187223]:     </video>
Nov 28 17:38:17 compute-0 nova_compute[187223]:     <input type="tablet" bus="usb"/>
Nov 28 17:38:17 compute-0 nova_compute[187223]:     <rng model="virtio">
Nov 28 17:38:17 compute-0 nova_compute[187223]:       <backend model="random">/dev/urandom</backend>
Nov 28 17:38:17 compute-0 nova_compute[187223]:     </rng>
Nov 28 17:38:17 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root"/>
Nov 28 17:38:17 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:38:17 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:38:17 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:38:17 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:38:17 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:38:17 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:38:17 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:38:17 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:38:17 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:38:17 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:38:17 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:38:17 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:38:17 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:38:17 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:38:17 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:38:17 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:38:17 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:38:17 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:38:17 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:38:17 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:38:17 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:38:17 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:38:17 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:38:17 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:38:17 compute-0 nova_compute[187223]:     <controller type="usb" index="0"/>
Nov 28 17:38:17 compute-0 nova_compute[187223]:     <memballoon model="virtio">
Nov 28 17:38:17 compute-0 nova_compute[187223]:       <stats period="10"/>
Nov 28 17:38:17 compute-0 nova_compute[187223]:     </memballoon>
Nov 28 17:38:17 compute-0 nova_compute[187223]:   </devices>
Nov 28 17:38:17 compute-0 nova_compute[187223]: </domain>
Nov 28 17:38:17 compute-0 nova_compute[187223]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 28 17:38:17 compute-0 nova_compute[187223]: 2025-11-28 17:38:17.180 187227 DEBUG nova.compute.manager [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Preparing to wait for external event network-vif-plugged-88207338-3bf0-499d-860e-d43ba0c80385 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 28 17:38:17 compute-0 nova_compute[187223]: 2025-11-28 17:38:17.180 187227 DEBUG oslo_concurrency.lockutils [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Acquiring lock "46d07989-a2d7-4ab0-a623-3bf99e2b2b81-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:38:17 compute-0 nova_compute[187223]: 2025-11-28 17:38:17.181 187227 DEBUG oslo_concurrency.lockutils [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Lock "46d07989-a2d7-4ab0-a623-3bf99e2b2b81-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:38:17 compute-0 nova_compute[187223]: 2025-11-28 17:38:17.181 187227 DEBUG oslo_concurrency.lockutils [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Lock "46d07989-a2d7-4ab0-a623-3bf99e2b2b81-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:38:17 compute-0 nova_compute[187223]: 2025-11-28 17:38:17.181 187227 DEBUG nova.virt.libvirt.vif [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T17:38:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1906783340',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1906783340',id=10,image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f8bcfa33baac4402a3841f550dae7748',ramdisk_id='',reservation_id='r-ykq8pcdm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1463017589',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1463017589-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T17:38:09Z,user_data=None,user_id='e6ed8cc17a7c4f34b32582c250e4b754',uuid=46d07989-a2d7-4ab0-a623-3bf99e2b2b81,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "88207338-3bf0-499d-860e-d43ba0c80385", "address": "fa:16:3e:ab:9f:57", "network": {"id": "9c78e191-f6d6-4fbb-a215-3abc59437ec7", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-222589930-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f8bcfa33baac4402a3841f550dae7748", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88207338-3b", "ovs_interfaceid": "88207338-3bf0-499d-860e-d43ba0c80385", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 28 17:38:17 compute-0 nova_compute[187223]: 2025-11-28 17:38:17.182 187227 DEBUG nova.network.os_vif_util [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Converting VIF {"id": "88207338-3bf0-499d-860e-d43ba0c80385", "address": "fa:16:3e:ab:9f:57", "network": {"id": "9c78e191-f6d6-4fbb-a215-3abc59437ec7", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-222589930-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f8bcfa33baac4402a3841f550dae7748", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88207338-3b", "ovs_interfaceid": "88207338-3bf0-499d-860e-d43ba0c80385", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 17:38:17 compute-0 nova_compute[187223]: 2025-11-28 17:38:17.182 187227 DEBUG nova.network.os_vif_util [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ab:9f:57,bridge_name='br-int',has_traffic_filtering=True,id=88207338-3bf0-499d-860e-d43ba0c80385,network=Network(9c78e191-f6d6-4fbb-a215-3abc59437ec7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap88207338-3b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 17:38:17 compute-0 nova_compute[187223]: 2025-11-28 17:38:17.183 187227 DEBUG os_vif [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ab:9f:57,bridge_name='br-int',has_traffic_filtering=True,id=88207338-3bf0-499d-860e-d43ba0c80385,network=Network(9c78e191-f6d6-4fbb-a215-3abc59437ec7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap88207338-3b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 28 17:38:17 compute-0 nova_compute[187223]: 2025-11-28 17:38:17.183 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:38:17 compute-0 nova_compute[187223]: 2025-11-28 17:38:17.184 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:38:17 compute-0 nova_compute[187223]: 2025-11-28 17:38:17.184 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 17:38:17 compute-0 nova_compute[187223]: 2025-11-28 17:38:17.187 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:38:17 compute-0 nova_compute[187223]: 2025-11-28 17:38:17.187 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap88207338-3b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:38:17 compute-0 nova_compute[187223]: 2025-11-28 17:38:17.188 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap88207338-3b, col_values=(('external_ids', {'iface-id': '88207338-3bf0-499d-860e-d43ba0c80385', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ab:9f:57', 'vm-uuid': '46d07989-a2d7-4ab0-a623-3bf99e2b2b81'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:38:17 compute-0 nova_compute[187223]: 2025-11-28 17:38:17.189 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:38:17 compute-0 NetworkManager[55763]: <info>  [1764351497.1902] manager: (tap88207338-3b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/37)
Nov 28 17:38:17 compute-0 nova_compute[187223]: 2025-11-28 17:38:17.192 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 17:38:17 compute-0 nova_compute[187223]: 2025-11-28 17:38:17.195 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:38:17 compute-0 nova_compute[187223]: 2025-11-28 17:38:17.196 187227 INFO os_vif [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ab:9f:57,bridge_name='br-int',has_traffic_filtering=True,id=88207338-3bf0-499d-860e-d43ba0c80385,network=Network(9c78e191-f6d6-4fbb-a215-3abc59437ec7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap88207338-3b')
Nov 28 17:38:17 compute-0 nova_compute[187223]: 2025-11-28 17:38:17.274 187227 DEBUG nova.virt.libvirt.driver [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 28 17:38:17 compute-0 nova_compute[187223]: 2025-11-28 17:38:17.274 187227 DEBUG nova.virt.libvirt.driver [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 28 17:38:17 compute-0 nova_compute[187223]: 2025-11-28 17:38:17.274 187227 DEBUG nova.virt.libvirt.driver [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] No VIF found with MAC fa:16:3e:ab:9f:57, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 28 17:38:17 compute-0 nova_compute[187223]: 2025-11-28 17:38:17.275 187227 INFO nova.virt.libvirt.driver [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Using config drive
Nov 28 17:38:18 compute-0 nova_compute[187223]: 2025-11-28 17:38:18.065 187227 INFO nova.virt.libvirt.driver [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Creating config drive at /var/lib/nova/instances/46d07989-a2d7-4ab0-a623-3bf99e2b2b81/disk.config
Nov 28 17:38:18 compute-0 nova_compute[187223]: 2025-11-28 17:38:18.071 187227 DEBUG oslo_concurrency.processutils [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/46d07989-a2d7-4ab0-a623-3bf99e2b2b81/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbql18alh execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:38:18 compute-0 nova_compute[187223]: 2025-11-28 17:38:18.199 187227 DEBUG oslo_concurrency.processutils [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/46d07989-a2d7-4ab0-a623-3bf99e2b2b81/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbql18alh" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:38:18 compute-0 kernel: tap88207338-3b: entered promiscuous mode
Nov 28 17:38:18 compute-0 NetworkManager[55763]: <info>  [1764351498.2843] manager: (tap88207338-3b): new Tun device (/org/freedesktop/NetworkManager/Devices/38)
Nov 28 17:38:18 compute-0 nova_compute[187223]: 2025-11-28 17:38:18.282 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:38:18 compute-0 nova_compute[187223]: 2025-11-28 17:38:18.288 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:38:18 compute-0 ovn_controller[95574]: 2025-11-28T17:38:18Z|00078|binding|INFO|Claiming lport 88207338-3bf0-499d-860e-d43ba0c80385 for this chassis.
Nov 28 17:38:18 compute-0 ovn_controller[95574]: 2025-11-28T17:38:18Z|00079|binding|INFO|88207338-3bf0-499d-860e-d43ba0c80385: Claiming fa:16:3e:ab:9f:57 10.100.0.5
Nov 28 17:38:18 compute-0 systemd-udevd[211438]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 17:38:18 compute-0 systemd-machined[153517]: New machine qemu-7-instance-0000000a.
Nov 28 17:38:18 compute-0 NetworkManager[55763]: <info>  [1764351498.3426] device (tap88207338-3b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 28 17:38:18 compute-0 NetworkManager[55763]: <info>  [1764351498.3441] device (tap88207338-3b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 28 17:38:18 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:38:18.341 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ab:9f:57 10.100.0.5'], port_security=['fa:16:3e:ab:9f:57 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '46d07989-a2d7-4ab0-a623-3bf99e2b2b81', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9c78e191-f6d6-4fbb-a215-3abc59437ec7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f8bcfa33baac4402a3841f550dae7748', 'neutron:revision_number': '2', 'neutron:security_group_ids': '4075dbba-04aa-4b93-a4f6-13da6a2d4e2b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7a780d48-7811-4465-9b47-456ebf0c9522, chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], logical_port=88207338-3bf0-499d-860e-d43ba0c80385) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 17:38:18 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:38:18.343 104433 INFO neutron.agent.ovn.metadata.agent [-] Port 88207338-3bf0-499d-860e-d43ba0c80385 in datapath 9c78e191-f6d6-4fbb-a215-3abc59437ec7 bound to our chassis
Nov 28 17:38:18 compute-0 systemd[1]: Started Virtual Machine qemu-7-instance-0000000a.
Nov 28 17:38:18 compute-0 nova_compute[187223]: 2025-11-28 17:38:18.347 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:38:18 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:38:18.347 104433 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9c78e191-f6d6-4fbb-a215-3abc59437ec7
Nov 28 17:38:18 compute-0 ovn_controller[95574]: 2025-11-28T17:38:18Z|00080|binding|INFO|Setting lport 88207338-3bf0-499d-860e-d43ba0c80385 ovn-installed in OVS
Nov 28 17:38:18 compute-0 ovn_controller[95574]: 2025-11-28T17:38:18Z|00081|binding|INFO|Setting lport 88207338-3bf0-499d-860e-d43ba0c80385 up in Southbound
Nov 28 17:38:18 compute-0 nova_compute[187223]: 2025-11-28 17:38:18.354 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:38:18 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:38:18.364 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[5b7cbf3b-1cc6-477d-9ecb-e63af959d08a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:38:18 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:38:18.366 104433 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9c78e191-f1 in ovnmeta-9c78e191-f6d6-4fbb-a215-3abc59437ec7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 28 17:38:18 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:38:18.368 208826 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9c78e191-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 28 17:38:18 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:38:18.368 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[8bb47c95-8687-4b00-8056-d4a88c1d24b4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:38:18 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:38:18.370 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[808423bf-b636-4c30-8e92-59e41eb985e3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:38:18 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:38:18.385 104546 DEBUG oslo.privsep.daemon [-] privsep: reply[887dbeb4-26d5-426e-bb96-77594ed58e2e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:38:18 compute-0 nova_compute[187223]: 2025-11-28 17:38:18.395 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:38:18 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:38:18.410 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[f2c5e8bf-ce22-4c08-a7b8-c6ec669277e2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:38:18 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:38:18.445 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[a245fea1-c33a-4edd-ab7a-bc7aeac29ea5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:38:18 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:38:18.451 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[cc2bc580-7202-46e9-9c71-0196f9ee237e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:38:18 compute-0 systemd-udevd[211441]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 17:38:18 compute-0 NetworkManager[55763]: <info>  [1764351498.4537] manager: (tap9c78e191-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/39)
Nov 28 17:38:18 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:38:18.482 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[9f01176b-d82b-4fc0-85d4-043350f84549]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:38:18 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:38:18.485 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[abb070fd-a950-4f45-9606-536f795f0cc8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:38:18 compute-0 NetworkManager[55763]: <info>  [1764351498.5102] device (tap9c78e191-f0): carrier: link connected
Nov 28 17:38:18 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:38:18.517 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[f683ef3a-9872-4e65-bbe0-c28684c239ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:38:18 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:38:18.538 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[dd8fdc6c-e3bc-48b5-85ae-cb3d5ebc48cf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9c78e191-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:88:f4:77'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 467313, 'reachable_time': 31539, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211472, 'error': None, 'target': 'ovnmeta-9c78e191-f6d6-4fbb-a215-3abc59437ec7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:38:18 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:38:18.558 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[cf0a5689-8839-45df-ad3e-13edb5026287]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe88:f477'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 467313, 'tstamp': 467313}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211473, 'error': None, 'target': 'ovnmeta-9c78e191-f6d6-4fbb-a215-3abc59437ec7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:38:18 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:38:18.580 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[11cdf9c7-d605-4f51-8cd6-38bbbf5136f4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9c78e191-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:88:f4:77'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 467313, 'reachable_time': 31539, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 211474, 'error': None, 'target': 'ovnmeta-9c78e191-f6d6-4fbb-a215-3abc59437ec7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:38:18 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:38:18.616 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[a539c9a0-e29d-4e52-9235-222da55ba34a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:38:18 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:38:18.687 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[a13d6558-cf63-4d8d-a429-3cb70735b752]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:38:18 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:38:18.689 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9c78e191-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:38:18 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:38:18.689 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 17:38:18 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:38:18.690 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9c78e191-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:38:18 compute-0 kernel: tap9c78e191-f0: entered promiscuous mode
Nov 28 17:38:18 compute-0 nova_compute[187223]: 2025-11-28 17:38:18.692 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:38:18 compute-0 NetworkManager[55763]: <info>  [1764351498.6948] manager: (tap9c78e191-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/40)
Nov 28 17:38:18 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:38:18.695 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9c78e191-f0, col_values=(('external_ids', {'iface-id': '6203fe5a-7b88-4f04-8376-3428766cdbb2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:38:18 compute-0 nova_compute[187223]: 2025-11-28 17:38:18.695 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:38:18 compute-0 nova_compute[187223]: 2025-11-28 17:38:18.696 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:38:18 compute-0 nova_compute[187223]: 2025-11-28 17:38:18.698 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:38:18 compute-0 ovn_controller[95574]: 2025-11-28T17:38:18Z|00082|binding|INFO|Releasing lport 6203fe5a-7b88-4f04-8376-3428766cdbb2 from this chassis (sb_readonly=0)
Nov 28 17:38:18 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:38:18.699 104433 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9c78e191-f6d6-4fbb-a215-3abc59437ec7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9c78e191-f6d6-4fbb-a215-3abc59437ec7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 28 17:38:18 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:38:18.700 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[697f6ed3-bcae-4576-93b0-71a0e66d6358]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:38:18 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:38:18.701 104433 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 28 17:38:18 compute-0 ovn_metadata_agent[104428]: global
Nov 28 17:38:18 compute-0 ovn_metadata_agent[104428]:     log         /dev/log local0 debug
Nov 28 17:38:18 compute-0 ovn_metadata_agent[104428]:     log-tag     haproxy-metadata-proxy-9c78e191-f6d6-4fbb-a215-3abc59437ec7
Nov 28 17:38:18 compute-0 ovn_metadata_agent[104428]:     user        root
Nov 28 17:38:18 compute-0 ovn_metadata_agent[104428]:     group       root
Nov 28 17:38:18 compute-0 ovn_metadata_agent[104428]:     maxconn     1024
Nov 28 17:38:18 compute-0 ovn_metadata_agent[104428]:     pidfile     /var/lib/neutron/external/pids/9c78e191-f6d6-4fbb-a215-3abc59437ec7.pid.haproxy
Nov 28 17:38:18 compute-0 ovn_metadata_agent[104428]:     daemon
Nov 28 17:38:18 compute-0 ovn_metadata_agent[104428]: 
Nov 28 17:38:18 compute-0 ovn_metadata_agent[104428]: defaults
Nov 28 17:38:18 compute-0 ovn_metadata_agent[104428]:     log global
Nov 28 17:38:18 compute-0 ovn_metadata_agent[104428]:     mode http
Nov 28 17:38:18 compute-0 ovn_metadata_agent[104428]:     option httplog
Nov 28 17:38:18 compute-0 ovn_metadata_agent[104428]:     option dontlognull
Nov 28 17:38:18 compute-0 ovn_metadata_agent[104428]:     option http-server-close
Nov 28 17:38:18 compute-0 ovn_metadata_agent[104428]:     option forwardfor
Nov 28 17:38:18 compute-0 ovn_metadata_agent[104428]:     retries                 3
Nov 28 17:38:18 compute-0 ovn_metadata_agent[104428]:     timeout http-request    30s
Nov 28 17:38:18 compute-0 ovn_metadata_agent[104428]:     timeout connect         30s
Nov 28 17:38:18 compute-0 ovn_metadata_agent[104428]:     timeout client          32s
Nov 28 17:38:18 compute-0 ovn_metadata_agent[104428]:     timeout server          32s
Nov 28 17:38:18 compute-0 ovn_metadata_agent[104428]:     timeout http-keep-alive 30s
Nov 28 17:38:18 compute-0 ovn_metadata_agent[104428]: 
Nov 28 17:38:18 compute-0 ovn_metadata_agent[104428]: 
Nov 28 17:38:18 compute-0 ovn_metadata_agent[104428]: listen listener
Nov 28 17:38:18 compute-0 ovn_metadata_agent[104428]:     bind 169.254.169.254:80
Nov 28 17:38:18 compute-0 ovn_metadata_agent[104428]:     server metadata /var/lib/neutron/metadata_proxy
Nov 28 17:38:18 compute-0 ovn_metadata_agent[104428]:     http-request add-header X-OVN-Network-ID 9c78e191-f6d6-4fbb-a215-3abc59437ec7
Nov 28 17:38:18 compute-0 ovn_metadata_agent[104428]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 28 17:38:18 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:38:18.702 104433 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9c78e191-f6d6-4fbb-a215-3abc59437ec7', 'env', 'PROCESS_TAG=haproxy-9c78e191-f6d6-4fbb-a215-3abc59437ec7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9c78e191-f6d6-4fbb-a215-3abc59437ec7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 28 17:38:18 compute-0 nova_compute[187223]: 2025-11-28 17:38:18.713 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:38:19 compute-0 nova_compute[187223]: 2025-11-28 17:38:19.063 187227 DEBUG nova.virt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Emitting event <LifecycleEvent: 1764351499.0616612, 46d07989-a2d7-4ab0-a623-3bf99e2b2b81 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:38:19 compute-0 nova_compute[187223]: 2025-11-28 17:38:19.065 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] VM Started (Lifecycle Event)
Nov 28 17:38:19 compute-0 nova_compute[187223]: 2025-11-28 17:38:19.098 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:38:19 compute-0 nova_compute[187223]: 2025-11-28 17:38:19.106 187227 DEBUG nova.virt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Emitting event <LifecycleEvent: 1764351499.0618484, 46d07989-a2d7-4ab0-a623-3bf99e2b2b81 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:38:19 compute-0 nova_compute[187223]: 2025-11-28 17:38:19.106 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] VM Paused (Lifecycle Event)
Nov 28 17:38:19 compute-0 nova_compute[187223]: 2025-11-28 17:38:19.119 187227 DEBUG nova.compute.manager [req-e0a7f2de-627f-4295-ab8c-e123e6499f0e req-9e7c6544-63bd-426f-83f8-def0bdc7dafa 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Received event network-vif-plugged-88207338-3bf0-499d-860e-d43ba0c80385 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:38:19 compute-0 nova_compute[187223]: 2025-11-28 17:38:19.119 187227 DEBUG oslo_concurrency.lockutils [req-e0a7f2de-627f-4295-ab8c-e123e6499f0e req-9e7c6544-63bd-426f-83f8-def0bdc7dafa 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "46d07989-a2d7-4ab0-a623-3bf99e2b2b81-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:38:19 compute-0 nova_compute[187223]: 2025-11-28 17:38:19.120 187227 DEBUG oslo_concurrency.lockutils [req-e0a7f2de-627f-4295-ab8c-e123e6499f0e req-9e7c6544-63bd-426f-83f8-def0bdc7dafa 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "46d07989-a2d7-4ab0-a623-3bf99e2b2b81-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:38:19 compute-0 nova_compute[187223]: 2025-11-28 17:38:19.120 187227 DEBUG oslo_concurrency.lockutils [req-e0a7f2de-627f-4295-ab8c-e123e6499f0e req-9e7c6544-63bd-426f-83f8-def0bdc7dafa 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "46d07989-a2d7-4ab0-a623-3bf99e2b2b81-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:38:19 compute-0 nova_compute[187223]: 2025-11-28 17:38:19.120 187227 DEBUG nova.compute.manager [req-e0a7f2de-627f-4295-ab8c-e123e6499f0e req-9e7c6544-63bd-426f-83f8-def0bdc7dafa 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Processing event network-vif-plugged-88207338-3bf0-499d-860e-d43ba0c80385 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 28 17:38:19 compute-0 nova_compute[187223]: 2025-11-28 17:38:19.121 187227 DEBUG nova.compute.manager [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 28 17:38:19 compute-0 nova_compute[187223]: 2025-11-28 17:38:19.126 187227 DEBUG nova.virt.libvirt.driver [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 28 17:38:19 compute-0 nova_compute[187223]: 2025-11-28 17:38:19.129 187227 INFO nova.virt.libvirt.driver [-] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Instance spawned successfully.
Nov 28 17:38:19 compute-0 nova_compute[187223]: 2025-11-28 17:38:19.131 187227 DEBUG nova.virt.libvirt.driver [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 28 17:38:19 compute-0 podman[211513]: 2025-11-28 17:38:19.145879171 +0000 UTC m=+0.101249805 container create 4a8ad07718805aefa8e0015bf567ea1f54a4d968d2b8ccc9bbe63b483e212db1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9c78e191-f6d6-4fbb-a215-3abc59437ec7, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 17:38:19 compute-0 nova_compute[187223]: 2025-11-28 17:38:19.150 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:38:19 compute-0 nova_compute[187223]: 2025-11-28 17:38:19.159 187227 DEBUG nova.virt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Emitting event <LifecycleEvent: 1764351499.1246932, 46d07989-a2d7-4ab0-a623-3bf99e2b2b81 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:38:19 compute-0 nova_compute[187223]: 2025-11-28 17:38:19.160 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] VM Resumed (Lifecycle Event)
Nov 28 17:38:19 compute-0 nova_compute[187223]: 2025-11-28 17:38:19.164 187227 DEBUG nova.virt.libvirt.driver [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:38:19 compute-0 nova_compute[187223]: 2025-11-28 17:38:19.165 187227 DEBUG nova.virt.libvirt.driver [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:38:19 compute-0 nova_compute[187223]: 2025-11-28 17:38:19.165 187227 DEBUG nova.virt.libvirt.driver [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:38:19 compute-0 nova_compute[187223]: 2025-11-28 17:38:19.166 187227 DEBUG nova.virt.libvirt.driver [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:38:19 compute-0 nova_compute[187223]: 2025-11-28 17:38:19.166 187227 DEBUG nova.virt.libvirt.driver [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:38:19 compute-0 nova_compute[187223]: 2025-11-28 17:38:19.166 187227 DEBUG nova.virt.libvirt.driver [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:38:19 compute-0 podman[211513]: 2025-11-28 17:38:19.077726849 +0000 UTC m=+0.033097503 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 17:38:19 compute-0 nova_compute[187223]: 2025-11-28 17:38:19.197 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:38:19 compute-0 nova_compute[187223]: 2025-11-28 17:38:19.201 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 28 17:38:19 compute-0 systemd[1]: Started libpod-conmon-4a8ad07718805aefa8e0015bf567ea1f54a4d968d2b8ccc9bbe63b483e212db1.scope.
Nov 28 17:38:19 compute-0 nova_compute[187223]: 2025-11-28 17:38:19.237 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 28 17:38:19 compute-0 systemd[1]: Started libcrun container.
Nov 28 17:38:19 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/20aed70a7077f0844d7b9f6fd136952000a5310a90befda91dad30647b5b309f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 17:38:19 compute-0 podman[211513]: 2025-11-28 17:38:19.26866397 +0000 UTC m=+0.224034624 container init 4a8ad07718805aefa8e0015bf567ea1f54a4d968d2b8ccc9bbe63b483e212db1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9c78e191-f6d6-4fbb-a215-3abc59437ec7, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 28 17:38:19 compute-0 podman[211513]: 2025-11-28 17:38:19.274945183 +0000 UTC m=+0.230315817 container start 4a8ad07718805aefa8e0015bf567ea1f54a4d968d2b8ccc9bbe63b483e212db1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9c78e191-f6d6-4fbb-a215-3abc59437ec7, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Nov 28 17:38:19 compute-0 nova_compute[187223]: 2025-11-28 17:38:19.279 187227 INFO nova.compute.manager [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Took 9.62 seconds to spawn the instance on the hypervisor.
Nov 28 17:38:19 compute-0 nova_compute[187223]: 2025-11-28 17:38:19.280 187227 DEBUG nova.compute.manager [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:38:19 compute-0 neutron-haproxy-ovnmeta-9c78e191-f6d6-4fbb-a215-3abc59437ec7[211529]: [NOTICE]   (211533) : New worker (211535) forked
Nov 28 17:38:19 compute-0 neutron-haproxy-ovnmeta-9c78e191-f6d6-4fbb-a215-3abc59437ec7[211529]: [NOTICE]   (211533) : Loading success.
Nov 28 17:38:19 compute-0 nova_compute[187223]: 2025-11-28 17:38:19.413 187227 INFO nova.compute.manager [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Took 10.20 seconds to build instance.
Nov 28 17:38:19 compute-0 nova_compute[187223]: 2025-11-28 17:38:19.473 187227 DEBUG oslo_concurrency.lockutils [None req-c5589a37-c446-42c2-a565-1aa6b5a58351 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Lock "46d07989-a2d7-4ab0-a623-3bf99e2b2b81" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.337s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:38:20 compute-0 nova_compute[187223]: 2025-11-28 17:38:20.843 187227 DEBUG nova.network.neutron [req-96e255bb-3658-4713-a5d9-1b99eb45d508 req-6019d429-4338-4bef-b2f6-a2ebdb8e6c8d 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Updated VIF entry in instance network info cache for port 88207338-3bf0-499d-860e-d43ba0c80385. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 28 17:38:20 compute-0 nova_compute[187223]: 2025-11-28 17:38:20.845 187227 DEBUG nova.network.neutron [req-96e255bb-3658-4713-a5d9-1b99eb45d508 req-6019d429-4338-4bef-b2f6-a2ebdb8e6c8d 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Updating instance_info_cache with network_info: [{"id": "88207338-3bf0-499d-860e-d43ba0c80385", "address": "fa:16:3e:ab:9f:57", "network": {"id": "9c78e191-f6d6-4fbb-a215-3abc59437ec7", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-222589930-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f8bcfa33baac4402a3841f550dae7748", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88207338-3b", "ovs_interfaceid": "88207338-3bf0-499d-860e-d43ba0c80385", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:38:22 compute-0 nova_compute[187223]: 2025-11-28 17:38:22.191 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:38:22 compute-0 nova_compute[187223]: 2025-11-28 17:38:22.381 187227 DEBUG nova.compute.manager [req-eff61360-d570-431f-a1ab-9be5cd815bbe req-eb8d3bec-563b-4e74-addc-5e9575ac9980 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Received event network-vif-plugged-88207338-3bf0-499d-860e-d43ba0c80385 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:38:22 compute-0 nova_compute[187223]: 2025-11-28 17:38:22.382 187227 DEBUG oslo_concurrency.lockutils [req-eff61360-d570-431f-a1ab-9be5cd815bbe req-eb8d3bec-563b-4e74-addc-5e9575ac9980 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "46d07989-a2d7-4ab0-a623-3bf99e2b2b81-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:38:22 compute-0 nova_compute[187223]: 2025-11-28 17:38:22.382 187227 DEBUG oslo_concurrency.lockutils [req-eff61360-d570-431f-a1ab-9be5cd815bbe req-eb8d3bec-563b-4e74-addc-5e9575ac9980 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "46d07989-a2d7-4ab0-a623-3bf99e2b2b81-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:38:22 compute-0 nova_compute[187223]: 2025-11-28 17:38:22.383 187227 DEBUG oslo_concurrency.lockutils [req-eff61360-d570-431f-a1ab-9be5cd815bbe req-eb8d3bec-563b-4e74-addc-5e9575ac9980 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "46d07989-a2d7-4ab0-a623-3bf99e2b2b81-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:38:22 compute-0 nova_compute[187223]: 2025-11-28 17:38:22.383 187227 DEBUG nova.compute.manager [req-eff61360-d570-431f-a1ab-9be5cd815bbe req-eb8d3bec-563b-4e74-addc-5e9575ac9980 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] No waiting events found dispatching network-vif-plugged-88207338-3bf0-499d-860e-d43ba0c80385 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:38:22 compute-0 nova_compute[187223]: 2025-11-28 17:38:22.383 187227 WARNING nova.compute.manager [req-eff61360-d570-431f-a1ab-9be5cd815bbe req-eb8d3bec-563b-4e74-addc-5e9575ac9980 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Received unexpected event network-vif-plugged-88207338-3bf0-499d-860e-d43ba0c80385 for instance with vm_state active and task_state None.
Nov 28 17:38:22 compute-0 nova_compute[187223]: 2025-11-28 17:38:22.385 187227 DEBUG oslo_concurrency.lockutils [req-96e255bb-3658-4713-a5d9-1b99eb45d508 req-6019d429-4338-4bef-b2f6-a2ebdb8e6c8d 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Releasing lock "refresh_cache-46d07989-a2d7-4ab0-a623-3bf99e2b2b81" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 17:38:23 compute-0 podman[211544]: 2025-11-28 17:38:23.210078247 +0000 UTC m=+0.063982911 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 17:38:23 compute-0 nova_compute[187223]: 2025-11-28 17:38:23.397 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:38:27 compute-0 nova_compute[187223]: 2025-11-28 17:38:27.195 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:38:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:38:27.685 104433 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:38:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:38:27.685 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:38:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:38:27.686 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:38:28 compute-0 nova_compute[187223]: 2025-11-28 17:38:28.399 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:38:29 compute-0 nova_compute[187223]: 2025-11-28 17:38:29.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:38:29 compute-0 nova_compute[187223]: 2025-11-28 17:38:29.685 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:38:29 compute-0 nova_compute[187223]: 2025-11-28 17:38:29.685 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 17:38:29 compute-0 podman[197556]: time="2025-11-28T17:38:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:38:29 compute-0 podman[197556]: @ - - [28/Nov/2025:17:38:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18323 "" "Go-http-client/1.1"
Nov 28 17:38:29 compute-0 podman[197556]: @ - - [28/Nov/2025:17:38:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3051 "" "Go-http-client/1.1"
Nov 28 17:38:30 compute-0 podman[211569]: 2025-11-28 17:38:30.201993942 +0000 UTC m=+0.062071256 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 28 17:38:31 compute-0 openstack_network_exporter[199717]: ERROR   17:38:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:38:31 compute-0 openstack_network_exporter[199717]: ERROR   17:38:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:38:31 compute-0 openstack_network_exporter[199717]: ERROR   17:38:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:38:31 compute-0 openstack_network_exporter[199717]: ERROR   17:38:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:38:31 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:38:31 compute-0 openstack_network_exporter[199717]: ERROR   17:38:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:38:31 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:38:31 compute-0 nova_compute[187223]: 2025-11-28 17:38:31.680 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:38:31 compute-0 nova_compute[187223]: 2025-11-28 17:38:31.755 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:38:31 compute-0 nova_compute[187223]: 2025-11-28 17:38:31.756 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:38:31 compute-0 nova_compute[187223]: 2025-11-28 17:38:31.779 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:38:31 compute-0 nova_compute[187223]: 2025-11-28 17:38:31.779 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:38:31 compute-0 nova_compute[187223]: 2025-11-28 17:38:31.779 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:38:31 compute-0 nova_compute[187223]: 2025-11-28 17:38:31.779 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 17:38:31 compute-0 nova_compute[187223]: 2025-11-28 17:38:31.859 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/46d07989-a2d7-4ab0-a623-3bf99e2b2b81/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:38:31 compute-0 nova_compute[187223]: 2025-11-28 17:38:31.930 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/46d07989-a2d7-4ab0-a623-3bf99e2b2b81/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:38:31 compute-0 nova_compute[187223]: 2025-11-28 17:38:31.932 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/46d07989-a2d7-4ab0-a623-3bf99e2b2b81/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:38:31 compute-0 nova_compute[187223]: 2025-11-28 17:38:31.989 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/46d07989-a2d7-4ab0-a623-3bf99e2b2b81/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:38:32 compute-0 nova_compute[187223]: 2025-11-28 17:38:32.172 187227 WARNING nova.virt.libvirt.driver [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 17:38:32 compute-0 nova_compute[187223]: 2025-11-28 17:38:32.174 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5720MB free_disk=73.3137435913086GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 17:38:32 compute-0 nova_compute[187223]: 2025-11-28 17:38:32.175 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:38:32 compute-0 nova_compute[187223]: 2025-11-28 17:38:32.175 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:38:32 compute-0 nova_compute[187223]: 2025-11-28 17:38:32.199 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:38:32 compute-0 nova_compute[187223]: 2025-11-28 17:38:32.260 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Instance 46d07989-a2d7-4ab0-a623-3bf99e2b2b81 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 17:38:32 compute-0 nova_compute[187223]: 2025-11-28 17:38:32.261 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 17:38:32 compute-0 nova_compute[187223]: 2025-11-28 17:38:32.261 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 17:38:32 compute-0 nova_compute[187223]: 2025-11-28 17:38:32.306 187227 DEBUG nova.compute.provider_tree [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 17:38:32 compute-0 nova_compute[187223]: 2025-11-28 17:38:32.324 187227 DEBUG nova.scheduler.client.report [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 17:38:32 compute-0 nova_compute[187223]: 2025-11-28 17:38:32.344 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 17:38:32 compute-0 nova_compute[187223]: 2025-11-28 17:38:32.345 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.170s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:38:32 compute-0 ovn_controller[95574]: 2025-11-28T17:38:32Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ab:9f:57 10.100.0.5
Nov 28 17:38:32 compute-0 ovn_controller[95574]: 2025-11-28T17:38:32Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ab:9f:57 10.100.0.5
Nov 28 17:38:33 compute-0 nova_compute[187223]: 2025-11-28 17:38:33.273 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:38:33 compute-0 nova_compute[187223]: 2025-11-28 17:38:33.401 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:38:33 compute-0 nova_compute[187223]: 2025-11-28 17:38:33.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:38:33 compute-0 nova_compute[187223]: 2025-11-28 17:38:33.684 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 17:38:33 compute-0 nova_compute[187223]: 2025-11-28 17:38:33.684 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 17:38:34 compute-0 nova_compute[187223]: 2025-11-28 17:38:34.803 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "refresh_cache-46d07989-a2d7-4ab0-a623-3bf99e2b2b81" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 17:38:34 compute-0 nova_compute[187223]: 2025-11-28 17:38:34.804 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquired lock "refresh_cache-46d07989-a2d7-4ab0-a623-3bf99e2b2b81" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 17:38:34 compute-0 nova_compute[187223]: 2025-11-28 17:38:34.804 187227 DEBUG nova.network.neutron [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 28 17:38:34 compute-0 nova_compute[187223]: 2025-11-28 17:38:34.804 187227 DEBUG nova.objects.instance [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 46d07989-a2d7-4ab0-a623-3bf99e2b2b81 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:38:35 compute-0 podman[211616]: 2025-11-28 17:38:35.217834712 +0000 UTC m=+0.077510114 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 17:38:35 compute-0 podman[211617]: 2025-11-28 17:38:35.241951624 +0000 UTC m=+0.098686230 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 28 17:38:37 compute-0 nova_compute[187223]: 2025-11-28 17:38:37.202 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:38:38 compute-0 podman[211662]: 2025-11-28 17:38:38.207544793 +0000 UTC m=+0.067029429 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public, managed_by=edpm_ansible, name=ubi9-minimal, build-date=2025-08-20T13:12:41, vcs-type=git, version=9.6, config_id=edpm, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, architecture=x86_64)
Nov 28 17:38:38 compute-0 nova_compute[187223]: 2025-11-28 17:38:38.405 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:38:39 compute-0 nova_compute[187223]: 2025-11-28 17:38:39.826 187227 DEBUG nova.network.neutron [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Updating instance_info_cache with network_info: [{"id": "88207338-3bf0-499d-860e-d43ba0c80385", "address": "fa:16:3e:ab:9f:57", "network": {"id": "9c78e191-f6d6-4fbb-a215-3abc59437ec7", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-222589930-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f8bcfa33baac4402a3841f550dae7748", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88207338-3b", "ovs_interfaceid": "88207338-3bf0-499d-860e-d43ba0c80385", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:38:39 compute-0 nova_compute[187223]: 2025-11-28 17:38:39.875 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Releasing lock "refresh_cache-46d07989-a2d7-4ab0-a623-3bf99e2b2b81" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 17:38:39 compute-0 nova_compute[187223]: 2025-11-28 17:38:39.876 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 28 17:38:39 compute-0 nova_compute[187223]: 2025-11-28 17:38:39.876 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:38:39 compute-0 nova_compute[187223]: 2025-11-28 17:38:39.876 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:38:42 compute-0 nova_compute[187223]: 2025-11-28 17:38:42.208 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:38:43 compute-0 nova_compute[187223]: 2025-11-28 17:38:43.409 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:38:43 compute-0 nova_compute[187223]: 2025-11-28 17:38:43.871 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:38:47 compute-0 nova_compute[187223]: 2025-11-28 17:38:47.214 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:38:48 compute-0 nova_compute[187223]: 2025-11-28 17:38:48.411 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:38:48 compute-0 ovn_controller[95574]: 2025-11-28T17:38:48Z|00083|memory_trim|INFO|Detected inactivity (last active 30014 ms ago): trimming memory
Nov 28 17:38:52 compute-0 nova_compute[187223]: 2025-11-28 17:38:52.218 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:38:53 compute-0 nova_compute[187223]: 2025-11-28 17:38:53.412 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:38:54 compute-0 podman[211684]: 2025-11-28 17:38:54.236891755 +0000 UTC m=+0.092457159 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 17:38:57 compute-0 nova_compute[187223]: 2025-11-28 17:38:57.221 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:38:58 compute-0 nova_compute[187223]: 2025-11-28 17:38:58.415 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:38:59 compute-0 podman[197556]: time="2025-11-28T17:38:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:38:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:38:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18323 "" "Go-http-client/1.1"
Nov 28 17:38:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:38:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3052 "" "Go-http-client/1.1"
Nov 28 17:39:01 compute-0 podman[211708]: 2025-11-28 17:39:01.222488863 +0000 UTC m=+0.079921294 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 28 17:39:01 compute-0 openstack_network_exporter[199717]: ERROR   17:39:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:39:01 compute-0 openstack_network_exporter[199717]: ERROR   17:39:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:39:01 compute-0 openstack_network_exporter[199717]: ERROR   17:39:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:39:01 compute-0 openstack_network_exporter[199717]: ERROR   17:39:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:39:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:39:01 compute-0 openstack_network_exporter[199717]: ERROR   17:39:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:39:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:39:02 compute-0 nova_compute[187223]: 2025-11-28 17:39:02.224 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:39:03 compute-0 nova_compute[187223]: 2025-11-28 17:39:03.417 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:39:06 compute-0 podman[211727]: 2025-11-28 17:39:06.244727063 +0000 UTC m=+0.104837718 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 17:39:06 compute-0 podman[211728]: 2025-11-28 17:39:06.24910424 +0000 UTC m=+0.106282230 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 17:39:07 compute-0 nova_compute[187223]: 2025-11-28 17:39:07.228 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:39:08 compute-0 nova_compute[187223]: 2025-11-28 17:39:08.419 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:39:09 compute-0 podman[211770]: 2025-11-28 17:39:09.206963005 +0000 UTC m=+0.067444981 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-type=git, io.openshift.tags=minimal rhel9, version=9.6, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter)
Nov 28 17:39:12 compute-0 nova_compute[187223]: 2025-11-28 17:39:12.232 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:39:13 compute-0 nova_compute[187223]: 2025-11-28 17:39:13.421 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:39:17 compute-0 nova_compute[187223]: 2025-11-28 17:39:17.258 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:39:18 compute-0 nova_compute[187223]: 2025-11-28 17:39:18.465 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:39:22 compute-0 nova_compute[187223]: 2025-11-28 17:39:22.262 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:39:23 compute-0 nova_compute[187223]: 2025-11-28 17:39:23.467 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:39:24 compute-0 nova_compute[187223]: 2025-11-28 17:39:24.803 187227 DEBUG nova.virt.libvirt.driver [None req-1b9c54dc-f0a7-431d-8b4f-b10b5885ac75 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4103b83a-8bcb-41ce-8044-ae4574ed2c4a] Creating tmpfile /var/lib/nova/instances/tmppe8ihel2 to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Nov 28 17:39:24 compute-0 nova_compute[187223]: 2025-11-28 17:39:24.804 187227 DEBUG nova.compute.manager [None req-1b9c54dc-f0a7-431d-8b4f-b10b5885ac75 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmppe8ihel2',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Nov 28 17:39:25 compute-0 podman[211793]: 2025-11-28 17:39:25.210601179 +0000 UTC m=+0.056272366 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 17:39:27 compute-0 nova_compute[187223]: 2025-11-28 17:39:27.267 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:39:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:39:27.686 104433 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:39:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:39:27.687 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:39:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:39:27.687 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:39:28 compute-0 nova_compute[187223]: 2025-11-28 17:39:28.428 187227 DEBUG nova.compute.manager [None req-1b9c54dc-f0a7-431d-8b4f-b10b5885ac75 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmppe8ihel2',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='4103b83a-8bcb-41ce-8044-ae4574ed2c4a',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Nov 28 17:39:28 compute-0 nova_compute[187223]: 2025-11-28 17:39:28.462 187227 DEBUG oslo_concurrency.lockutils [None req-1b9c54dc-f0a7-431d-8b4f-b10b5885ac75 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "refresh_cache-4103b83a-8bcb-41ce-8044-ae4574ed2c4a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 17:39:28 compute-0 nova_compute[187223]: 2025-11-28 17:39:28.463 187227 DEBUG oslo_concurrency.lockutils [None req-1b9c54dc-f0a7-431d-8b4f-b10b5885ac75 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquired lock "refresh_cache-4103b83a-8bcb-41ce-8044-ae4574ed2c4a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 17:39:28 compute-0 nova_compute[187223]: 2025-11-28 17:39:28.463 187227 DEBUG nova.network.neutron [None req-1b9c54dc-f0a7-431d-8b4f-b10b5885ac75 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4103b83a-8bcb-41ce-8044-ae4574ed2c4a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 28 17:39:28 compute-0 nova_compute[187223]: 2025-11-28 17:39:28.469 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:39:29 compute-0 podman[197556]: time="2025-11-28T17:39:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:39:29 compute-0 podman[197556]: @ - - [28/Nov/2025:17:39:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18323 "" "Go-http-client/1.1"
Nov 28 17:39:29 compute-0 podman[197556]: @ - - [28/Nov/2025:17:39:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3051 "" "Go-http-client/1.1"
Nov 28 17:39:30 compute-0 nova_compute[187223]: 2025-11-28 17:39:30.375 187227 DEBUG nova.network.neutron [None req-1b9c54dc-f0a7-431d-8b4f-b10b5885ac75 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4103b83a-8bcb-41ce-8044-ae4574ed2c4a] Updating instance_info_cache with network_info: [{"id": "d191711f-581c-4ac9-8d1f-337f340a2713", "address": "fa:16:3e:f9:14:c7", "network": {"id": "9c78e191-f6d6-4fbb-a215-3abc59437ec7", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-222589930-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f8bcfa33baac4402a3841f550dae7748", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd191711f-58", "ovs_interfaceid": "d191711f-581c-4ac9-8d1f-337f340a2713", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:39:30 compute-0 nova_compute[187223]: 2025-11-28 17:39:30.406 187227 DEBUG oslo_concurrency.lockutils [None req-1b9c54dc-f0a7-431d-8b4f-b10b5885ac75 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Releasing lock "refresh_cache-4103b83a-8bcb-41ce-8044-ae4574ed2c4a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 17:39:30 compute-0 nova_compute[187223]: 2025-11-28 17:39:30.408 187227 DEBUG nova.virt.libvirt.driver [None req-1b9c54dc-f0a7-431d-8b4f-b10b5885ac75 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4103b83a-8bcb-41ce-8044-ae4574ed2c4a] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmppe8ihel2',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='4103b83a-8bcb-41ce-8044-ae4574ed2c4a',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Nov 28 17:39:30 compute-0 nova_compute[187223]: 2025-11-28 17:39:30.409 187227 DEBUG nova.virt.libvirt.driver [None req-1b9c54dc-f0a7-431d-8b4f-b10b5885ac75 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4103b83a-8bcb-41ce-8044-ae4574ed2c4a] Creating instance directory: /var/lib/nova/instances/4103b83a-8bcb-41ce-8044-ae4574ed2c4a pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Nov 28 17:39:30 compute-0 nova_compute[187223]: 2025-11-28 17:39:30.410 187227 DEBUG nova.virt.libvirt.driver [None req-1b9c54dc-f0a7-431d-8b4f-b10b5885ac75 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4103b83a-8bcb-41ce-8044-ae4574ed2c4a] Creating disk.info with the contents: {'/var/lib/nova/instances/4103b83a-8bcb-41ce-8044-ae4574ed2c4a/disk': 'qcow2', '/var/lib/nova/instances/4103b83a-8bcb-41ce-8044-ae4574ed2c4a/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Nov 28 17:39:30 compute-0 nova_compute[187223]: 2025-11-28 17:39:30.410 187227 DEBUG nova.virt.libvirt.driver [None req-1b9c54dc-f0a7-431d-8b4f-b10b5885ac75 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4103b83a-8bcb-41ce-8044-ae4574ed2c4a] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Nov 28 17:39:30 compute-0 nova_compute[187223]: 2025-11-28 17:39:30.411 187227 DEBUG nova.objects.instance [None req-1b9c54dc-f0a7-431d-8b4f-b10b5885ac75 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 4103b83a-8bcb-41ce-8044-ae4574ed2c4a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:39:30 compute-0 nova_compute[187223]: 2025-11-28 17:39:30.437 187227 DEBUG oslo_concurrency.processutils [None req-1b9c54dc-f0a7-431d-8b4f-b10b5885ac75 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:39:30 compute-0 nova_compute[187223]: 2025-11-28 17:39:30.497 187227 DEBUG oslo_concurrency.processutils [None req-1b9c54dc-f0a7-431d-8b4f-b10b5885ac75 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:39:30 compute-0 nova_compute[187223]: 2025-11-28 17:39:30.499 187227 DEBUG oslo_concurrency.lockutils [None req-1b9c54dc-f0a7-431d-8b4f-b10b5885ac75 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "bd6565e0cc32420aac21dd3de8842bc53bb13eba" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:39:30 compute-0 nova_compute[187223]: 2025-11-28 17:39:30.500 187227 DEBUG oslo_concurrency.lockutils [None req-1b9c54dc-f0a7-431d-8b4f-b10b5885ac75 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "bd6565e0cc32420aac21dd3de8842bc53bb13eba" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:39:30 compute-0 nova_compute[187223]: 2025-11-28 17:39:30.511 187227 DEBUG oslo_concurrency.processutils [None req-1b9c54dc-f0a7-431d-8b4f-b10b5885ac75 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:39:30 compute-0 nova_compute[187223]: 2025-11-28 17:39:30.573 187227 DEBUG oslo_concurrency.processutils [None req-1b9c54dc-f0a7-431d-8b4f-b10b5885ac75 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:39:30 compute-0 nova_compute[187223]: 2025-11-28 17:39:30.574 187227 DEBUG oslo_concurrency.processutils [None req-1b9c54dc-f0a7-431d-8b4f-b10b5885ac75 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba,backing_fmt=raw /var/lib/nova/instances/4103b83a-8bcb-41ce-8044-ae4574ed2c4a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:39:30 compute-0 nova_compute[187223]: 2025-11-28 17:39:30.613 187227 DEBUG oslo_concurrency.processutils [None req-1b9c54dc-f0a7-431d-8b4f-b10b5885ac75 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba,backing_fmt=raw /var/lib/nova/instances/4103b83a-8bcb-41ce-8044-ae4574ed2c4a/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:39:30 compute-0 nova_compute[187223]: 2025-11-28 17:39:30.615 187227 DEBUG oslo_concurrency.lockutils [None req-1b9c54dc-f0a7-431d-8b4f-b10b5885ac75 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "bd6565e0cc32420aac21dd3de8842bc53bb13eba" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.115s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:39:30 compute-0 nova_compute[187223]: 2025-11-28 17:39:30.616 187227 DEBUG oslo_concurrency.processutils [None req-1b9c54dc-f0a7-431d-8b4f-b10b5885ac75 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:39:30 compute-0 nova_compute[187223]: 2025-11-28 17:39:30.697 187227 DEBUG oslo_concurrency.processutils [None req-1b9c54dc-f0a7-431d-8b4f-b10b5885ac75 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:39:30 compute-0 nova_compute[187223]: 2025-11-28 17:39:30.699 187227 DEBUG nova.virt.disk.api [None req-1b9c54dc-f0a7-431d-8b4f-b10b5885ac75 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Checking if we can resize image /var/lib/nova/instances/4103b83a-8bcb-41ce-8044-ae4574ed2c4a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 28 17:39:30 compute-0 nova_compute[187223]: 2025-11-28 17:39:30.701 187227 DEBUG oslo_concurrency.processutils [None req-1b9c54dc-f0a7-431d-8b4f-b10b5885ac75 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4103b83a-8bcb-41ce-8044-ae4574ed2c4a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:39:30 compute-0 nova_compute[187223]: 2025-11-28 17:39:30.761 187227 DEBUG oslo_concurrency.processutils [None req-1b9c54dc-f0a7-431d-8b4f-b10b5885ac75 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4103b83a-8bcb-41ce-8044-ae4574ed2c4a/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:39:30 compute-0 nova_compute[187223]: 2025-11-28 17:39:30.762 187227 DEBUG nova.virt.disk.api [None req-1b9c54dc-f0a7-431d-8b4f-b10b5885ac75 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Cannot resize image /var/lib/nova/instances/4103b83a-8bcb-41ce-8044-ae4574ed2c4a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 28 17:39:30 compute-0 nova_compute[187223]: 2025-11-28 17:39:30.762 187227 DEBUG nova.objects.instance [None req-1b9c54dc-f0a7-431d-8b4f-b10b5885ac75 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lazy-loading 'migration_context' on Instance uuid 4103b83a-8bcb-41ce-8044-ae4574ed2c4a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:39:30 compute-0 nova_compute[187223]: 2025-11-28 17:39:30.779 187227 DEBUG oslo_concurrency.processutils [None req-1b9c54dc-f0a7-431d-8b4f-b10b5885ac75 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/4103b83a-8bcb-41ce-8044-ae4574ed2c4a/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:39:30 compute-0 nova_compute[187223]: 2025-11-28 17:39:30.806 187227 DEBUG oslo_concurrency.processutils [None req-1b9c54dc-f0a7-431d-8b4f-b10b5885ac75 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/4103b83a-8bcb-41ce-8044-ae4574ed2c4a/disk.config 485376" returned: 0 in 0.027s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:39:30 compute-0 nova_compute[187223]: 2025-11-28 17:39:30.810 187227 DEBUG nova.virt.libvirt.volume.remotefs [None req-1b9c54dc-f0a7-431d-8b4f-b10b5885ac75 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Copying file compute-1.ctlplane.example.com:/var/lib/nova/instances/4103b83a-8bcb-41ce-8044-ae4574ed2c4a/disk.config to /var/lib/nova/instances/4103b83a-8bcb-41ce-8044-ae4574ed2c4a copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Nov 28 17:39:30 compute-0 nova_compute[187223]: 2025-11-28 17:39:30.810 187227 DEBUG oslo_concurrency.processutils [None req-1b9c54dc-f0a7-431d-8b4f-b10b5885ac75 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Running cmd (subprocess): scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/4103b83a-8bcb-41ce-8044-ae4574ed2c4a/disk.config /var/lib/nova/instances/4103b83a-8bcb-41ce-8044-ae4574ed2c4a execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:39:31 compute-0 nova_compute[187223]: 2025-11-28 17:39:31.332 187227 DEBUG oslo_concurrency.processutils [None req-1b9c54dc-f0a7-431d-8b4f-b10b5885ac75 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] CMD "scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/4103b83a-8bcb-41ce-8044-ae4574ed2c4a/disk.config /var/lib/nova/instances/4103b83a-8bcb-41ce-8044-ae4574ed2c4a" returned: 0 in 0.521s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:39:31 compute-0 nova_compute[187223]: 2025-11-28 17:39:31.333 187227 DEBUG nova.virt.libvirt.driver [None req-1b9c54dc-f0a7-431d-8b4f-b10b5885ac75 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4103b83a-8bcb-41ce-8044-ae4574ed2c4a] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Nov 28 17:39:31 compute-0 nova_compute[187223]: 2025-11-28 17:39:31.335 187227 DEBUG nova.virt.libvirt.vif [None req-1b9c54dc-f0a7-431d-8b4f-b10b5885ac75 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T17:37:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-893617996',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-893617996',id=9,image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-28T17:38:01Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='f8bcfa33baac4402a3841f550dae7748',ramdisk_id='',reservation_id='r-vyilu9si',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1463017589',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1463017589-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T17:38:01Z,user_data=None,user_id='e6ed8cc17a7c4f34b32582c250e4b754',uuid=4103b83a-8bcb-41ce-8044-ae4574ed2c4a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d191711f-581c-4ac9-8d1f-337f340a2713", "address": "fa:16:3e:f9:14:c7", "network": {"id": "9c78e191-f6d6-4fbb-a215-3abc59437ec7", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-222589930-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f8bcfa33baac4402a3841f550dae7748", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapd191711f-58", "ovs_interfaceid": "d191711f-581c-4ac9-8d1f-337f340a2713", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 28 17:39:31 compute-0 nova_compute[187223]: 2025-11-28 17:39:31.335 187227 DEBUG nova.network.os_vif_util [None req-1b9c54dc-f0a7-431d-8b4f-b10b5885ac75 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Converting VIF {"id": "d191711f-581c-4ac9-8d1f-337f340a2713", "address": "fa:16:3e:f9:14:c7", "network": {"id": "9c78e191-f6d6-4fbb-a215-3abc59437ec7", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-222589930-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f8bcfa33baac4402a3841f550dae7748", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapd191711f-58", "ovs_interfaceid": "d191711f-581c-4ac9-8d1f-337f340a2713", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 17:39:31 compute-0 nova_compute[187223]: 2025-11-28 17:39:31.336 187227 DEBUG nova.network.os_vif_util [None req-1b9c54dc-f0a7-431d-8b4f-b10b5885ac75 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f9:14:c7,bridge_name='br-int',has_traffic_filtering=True,id=d191711f-581c-4ac9-8d1f-337f340a2713,network=Network(9c78e191-f6d6-4fbb-a215-3abc59437ec7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd191711f-58') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 17:39:31 compute-0 nova_compute[187223]: 2025-11-28 17:39:31.336 187227 DEBUG os_vif [None req-1b9c54dc-f0a7-431d-8b4f-b10b5885ac75 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f9:14:c7,bridge_name='br-int',has_traffic_filtering=True,id=d191711f-581c-4ac9-8d1f-337f340a2713,network=Network(9c78e191-f6d6-4fbb-a215-3abc59437ec7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd191711f-58') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 28 17:39:31 compute-0 nova_compute[187223]: 2025-11-28 17:39:31.337 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:39:31 compute-0 nova_compute[187223]: 2025-11-28 17:39:31.337 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:39:31 compute-0 nova_compute[187223]: 2025-11-28 17:39:31.338 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 17:39:31 compute-0 nova_compute[187223]: 2025-11-28 17:39:31.342 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:39:31 compute-0 nova_compute[187223]: 2025-11-28 17:39:31.342 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd191711f-58, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:39:31 compute-0 nova_compute[187223]: 2025-11-28 17:39:31.343 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd191711f-58, col_values=(('external_ids', {'iface-id': 'd191711f-581c-4ac9-8d1f-337f340a2713', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f9:14:c7', 'vm-uuid': '4103b83a-8bcb-41ce-8044-ae4574ed2c4a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:39:31 compute-0 nova_compute[187223]: 2025-11-28 17:39:31.344 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:39:31 compute-0 NetworkManager[55763]: <info>  [1764351571.3458] manager: (tapd191711f-58): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/41)
Nov 28 17:39:31 compute-0 nova_compute[187223]: 2025-11-28 17:39:31.348 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 17:39:31 compute-0 nova_compute[187223]: 2025-11-28 17:39:31.352 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:39:31 compute-0 nova_compute[187223]: 2025-11-28 17:39:31.353 187227 INFO os_vif [None req-1b9c54dc-f0a7-431d-8b4f-b10b5885ac75 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f9:14:c7,bridge_name='br-int',has_traffic_filtering=True,id=d191711f-581c-4ac9-8d1f-337f340a2713,network=Network(9c78e191-f6d6-4fbb-a215-3abc59437ec7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd191711f-58')
Nov 28 17:39:31 compute-0 nova_compute[187223]: 2025-11-28 17:39:31.354 187227 DEBUG nova.virt.libvirt.driver [None req-1b9c54dc-f0a7-431d-8b4f-b10b5885ac75 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Nov 28 17:39:31 compute-0 nova_compute[187223]: 2025-11-28 17:39:31.354 187227 DEBUG nova.compute.manager [None req-1b9c54dc-f0a7-431d-8b4f-b10b5885ac75 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmppe8ihel2',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='4103b83a-8bcb-41ce-8044-ae4574ed2c4a',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Nov 28 17:39:31 compute-0 openstack_network_exporter[199717]: ERROR   17:39:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:39:31 compute-0 openstack_network_exporter[199717]: ERROR   17:39:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:39:31 compute-0 openstack_network_exporter[199717]: ERROR   17:39:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:39:31 compute-0 openstack_network_exporter[199717]: ERROR   17:39:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:39:31 compute-0 openstack_network_exporter[199717]: ERROR   17:39:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:39:31 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:39:31 compute-0 nova_compute[187223]: 2025-11-28 17:39:31.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:39:31 compute-0 nova_compute[187223]: 2025-11-28 17:39:31.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:39:31 compute-0 nova_compute[187223]: 2025-11-28 17:39:31.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:39:31 compute-0 nova_compute[187223]: 2025-11-28 17:39:31.684 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 17:39:31 compute-0 nova_compute[187223]: 2025-11-28 17:39:31.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:39:31 compute-0 nova_compute[187223]: 2025-11-28 17:39:31.710 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:39:31 compute-0 nova_compute[187223]: 2025-11-28 17:39:31.711 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:39:31 compute-0 nova_compute[187223]: 2025-11-28 17:39:31.711 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:39:31 compute-0 nova_compute[187223]: 2025-11-28 17:39:31.711 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 17:39:31 compute-0 nova_compute[187223]: 2025-11-28 17:39:31.774 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/46d07989-a2d7-4ab0-a623-3bf99e2b2b81/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:39:31 compute-0 nova_compute[187223]: 2025-11-28 17:39:31.842 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/46d07989-a2d7-4ab0-a623-3bf99e2b2b81/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:39:31 compute-0 nova_compute[187223]: 2025-11-28 17:39:31.844 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/46d07989-a2d7-4ab0-a623-3bf99e2b2b81/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:39:31 compute-0 nova_compute[187223]: 2025-11-28 17:39:31.913 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/46d07989-a2d7-4ab0-a623-3bf99e2b2b81/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:39:32 compute-0 nova_compute[187223]: 2025-11-28 17:39:32.090 187227 WARNING nova.virt.libvirt.driver [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 17:39:32 compute-0 nova_compute[187223]: 2025-11-28 17:39:32.093 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5715MB free_disk=73.3113899230957GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 17:39:32 compute-0 nova_compute[187223]: 2025-11-28 17:39:32.093 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:39:32 compute-0 nova_compute[187223]: 2025-11-28 17:39:32.094 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:39:32 compute-0 nova_compute[187223]: 2025-11-28 17:39:32.145 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Migration for instance 4103b83a-8bcb-41ce-8044-ae4574ed2c4a refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Nov 28 17:39:32 compute-0 nova_compute[187223]: 2025-11-28 17:39:32.167 187227 INFO nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] [instance: 4103b83a-8bcb-41ce-8044-ae4574ed2c4a] Updating resource usage from migration 97617701-8142-43b8-b4ec-3ec50d586959
Nov 28 17:39:32 compute-0 nova_compute[187223]: 2025-11-28 17:39:32.168 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] [instance: 4103b83a-8bcb-41ce-8044-ae4574ed2c4a] Starting to track incoming migration 97617701-8142-43b8-b4ec-3ec50d586959 with flavor 6f44bded-bdbe-4623-9c87-afc5919e8381 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Nov 28 17:39:32 compute-0 podman[211845]: 2025-11-28 17:39:32.206030153 +0000 UTC m=+0.061352014 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 17:39:32 compute-0 nova_compute[187223]: 2025-11-28 17:39:32.206 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Instance 46d07989-a2d7-4ab0-a623-3bf99e2b2b81 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 17:39:32 compute-0 nova_compute[187223]: 2025-11-28 17:39:32.224 187227 WARNING nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Instance 4103b83a-8bcb-41ce-8044-ae4574ed2c4a has been moved to another host compute-1.ctlplane.example.com(compute-1.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Nov 28 17:39:32 compute-0 nova_compute[187223]: 2025-11-28 17:39:32.225 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 17:39:32 compute-0 nova_compute[187223]: 2025-11-28 17:39:32.225 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 17:39:32 compute-0 nova_compute[187223]: 2025-11-28 17:39:32.313 187227 DEBUG nova.compute.provider_tree [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 17:39:32 compute-0 nova_compute[187223]: 2025-11-28 17:39:32.338 187227 DEBUG nova.scheduler.client.report [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 17:39:32 compute-0 nova_compute[187223]: 2025-11-28 17:39:32.361 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:39:32 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:39:32.361 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3e:3c:af', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '0e:e0:60:f9:5f:25'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 17:39:32 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:39:32.363 104433 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 17:39:32 compute-0 nova_compute[187223]: 2025-11-28 17:39:32.366 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 17:39:32 compute-0 nova_compute[187223]: 2025-11-28 17:39:32.366 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.272s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:39:32 compute-0 nova_compute[187223]: 2025-11-28 17:39:32.989 187227 DEBUG nova.network.neutron [None req-1b9c54dc-f0a7-431d-8b4f-b10b5885ac75 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4103b83a-8bcb-41ce-8044-ae4574ed2c4a] Port d191711f-581c-4ac9-8d1f-337f340a2713 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Nov 28 17:39:32 compute-0 nova_compute[187223]: 2025-11-28 17:39:32.991 187227 DEBUG nova.compute.manager [None req-1b9c54dc-f0a7-431d-8b4f-b10b5885ac75 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmppe8ihel2',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='4103b83a-8bcb-41ce-8044-ae4574ed2c4a',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Nov 28 17:39:33 compute-0 systemd[1]: Starting libvirt proxy daemon...
Nov 28 17:39:33 compute-0 systemd[1]: Started libvirt proxy daemon.
Nov 28 17:39:33 compute-0 kernel: tapd191711f-58: entered promiscuous mode
Nov 28 17:39:33 compute-0 NetworkManager[55763]: <info>  [1764351573.2767] manager: (tapd191711f-58): new Tun device (/org/freedesktop/NetworkManager/Devices/42)
Nov 28 17:39:33 compute-0 ovn_controller[95574]: 2025-11-28T17:39:33Z|00084|binding|INFO|Claiming lport d191711f-581c-4ac9-8d1f-337f340a2713 for this additional chassis.
Nov 28 17:39:33 compute-0 ovn_controller[95574]: 2025-11-28T17:39:33Z|00085|binding|INFO|d191711f-581c-4ac9-8d1f-337f340a2713: Claiming fa:16:3e:f9:14:c7 10.100.0.9
Nov 28 17:39:33 compute-0 nova_compute[187223]: 2025-11-28 17:39:33.277 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:39:33 compute-0 ovn_controller[95574]: 2025-11-28T17:39:33Z|00086|binding|INFO|Setting lport d191711f-581c-4ac9-8d1f-337f340a2713 ovn-installed in OVS
Nov 28 17:39:33 compute-0 nova_compute[187223]: 2025-11-28 17:39:33.289 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:39:33 compute-0 systemd-udevd[211897]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 17:39:33 compute-0 systemd-machined[153517]: New machine qemu-8-instance-00000009.
Nov 28 17:39:33 compute-0 NetworkManager[55763]: <info>  [1764351573.3266] device (tapd191711f-58): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 28 17:39:33 compute-0 NetworkManager[55763]: <info>  [1764351573.3276] device (tapd191711f-58): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 28 17:39:33 compute-0 systemd[1]: Started Virtual Machine qemu-8-instance-00000009.
Nov 28 17:39:33 compute-0 nova_compute[187223]: 2025-11-28 17:39:33.471 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:39:35 compute-0 nova_compute[187223]: 2025-11-28 17:39:35.096 187227 DEBUG nova.virt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Emitting event <LifecycleEvent: 1764351575.0954716, 4103b83a-8bcb-41ce-8044-ae4574ed2c4a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:39:35 compute-0 nova_compute[187223]: 2025-11-28 17:39:35.097 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 4103b83a-8bcb-41ce-8044-ae4574ed2c4a] VM Started (Lifecycle Event)
Nov 28 17:39:35 compute-0 nova_compute[187223]: 2025-11-28 17:39:35.126 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 4103b83a-8bcb-41ce-8044-ae4574ed2c4a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:39:35 compute-0 nova_compute[187223]: 2025-11-28 17:39:35.367 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:39:35 compute-0 nova_compute[187223]: 2025-11-28 17:39:35.367 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 17:39:35 compute-0 nova_compute[187223]: 2025-11-28 17:39:35.368 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 17:39:35 compute-0 nova_compute[187223]: 2025-11-28 17:39:35.847 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "refresh_cache-46d07989-a2d7-4ab0-a623-3bf99e2b2b81" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 17:39:35 compute-0 nova_compute[187223]: 2025-11-28 17:39:35.847 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquired lock "refresh_cache-46d07989-a2d7-4ab0-a623-3bf99e2b2b81" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 17:39:35 compute-0 nova_compute[187223]: 2025-11-28 17:39:35.848 187227 DEBUG nova.network.neutron [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 28 17:39:35 compute-0 nova_compute[187223]: 2025-11-28 17:39:35.849 187227 DEBUG nova.objects.instance [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 46d07989-a2d7-4ab0-a623-3bf99e2b2b81 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:39:35 compute-0 nova_compute[187223]: 2025-11-28 17:39:35.919 187227 DEBUG nova.virt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Emitting event <LifecycleEvent: 1764351575.91887, 4103b83a-8bcb-41ce-8044-ae4574ed2c4a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:39:35 compute-0 nova_compute[187223]: 2025-11-28 17:39:35.920 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 4103b83a-8bcb-41ce-8044-ae4574ed2c4a] VM Resumed (Lifecycle Event)
Nov 28 17:39:35 compute-0 nova_compute[187223]: 2025-11-28 17:39:35.957 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 4103b83a-8bcb-41ce-8044-ae4574ed2c4a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:39:35 compute-0 nova_compute[187223]: 2025-11-28 17:39:35.962 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 4103b83a-8bcb-41ce-8044-ae4574ed2c4a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 28 17:39:35 compute-0 nova_compute[187223]: 2025-11-28 17:39:35.984 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 4103b83a-8bcb-41ce-8044-ae4574ed2c4a] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-0.ctlplane.example.com
Nov 28 17:39:36 compute-0 nova_compute[187223]: 2025-11-28 17:39:36.346 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:39:36 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:39:36.366 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2ad2dbac-a967-40fb-b69b-7c374c5f8e9d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:39:37 compute-0 podman[211931]: 2025-11-28 17:39:37.228384797 +0000 UTC m=+0.081742846 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=multipathd)
Nov 28 17:39:37 compute-0 podman[211932]: 2025-11-28 17:39:37.2508313 +0000 UTC m=+0.104423756 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 28 17:39:38 compute-0 ovn_controller[95574]: 2025-11-28T17:39:38Z|00087|binding|INFO|Claiming lport d191711f-581c-4ac9-8d1f-337f340a2713 for this chassis.
Nov 28 17:39:38 compute-0 ovn_controller[95574]: 2025-11-28T17:39:38Z|00088|binding|INFO|d191711f-581c-4ac9-8d1f-337f340a2713: Claiming fa:16:3e:f9:14:c7 10.100.0.9
Nov 28 17:39:38 compute-0 ovn_controller[95574]: 2025-11-28T17:39:38Z|00089|binding|INFO|Setting lport d191711f-581c-4ac9-8d1f-337f340a2713 up in Southbound
Nov 28 17:39:38 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:39:38.420 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f9:14:c7 10.100.0.9'], port_security=['fa:16:3e:f9:14:c7 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '4103b83a-8bcb-41ce-8044-ae4574ed2c4a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9c78e191-f6d6-4fbb-a215-3abc59437ec7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f8bcfa33baac4402a3841f550dae7748', 'neutron:revision_number': '11', 'neutron:security_group_ids': '4075dbba-04aa-4b93-a4f6-13da6a2d4e2b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7a780d48-7811-4465-9b47-456ebf0c9522, chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], logical_port=d191711f-581c-4ac9-8d1f-337f340a2713) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 17:39:38 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:39:38.422 104433 INFO neutron.agent.ovn.metadata.agent [-] Port d191711f-581c-4ac9-8d1f-337f340a2713 in datapath 9c78e191-f6d6-4fbb-a215-3abc59437ec7 bound to our chassis
Nov 28 17:39:38 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:39:38.424 104433 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9c78e191-f6d6-4fbb-a215-3abc59437ec7
Nov 28 17:39:38 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:39:38.445 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[0214224f-f828-4cb7-bc5b-9a6a68679764]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:39:38 compute-0 nova_compute[187223]: 2025-11-28 17:39:38.473 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:39:38 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:39:38.480 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[fe7ac856-d203-4ba7-b319-f7e05c23b410]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:39:38 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:39:38.484 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[c9149d9c-c0b9-4849-b06d-263ffa229b92]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:39:38 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:39:38.518 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[90cc6e96-d645-4da5-9f62-baa2ceae09d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:39:38 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:39:38.546 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[4f8ba7ea-fdac-4a19-ae94-f50660f801ca]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9c78e191-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:88:f4:77'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 467313, 'reachable_time': 31539, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211982, 'error': None, 'target': 'ovnmeta-9c78e191-f6d6-4fbb-a215-3abc59437ec7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:39:38 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:39:38.569 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[7046a956-4359-4e7e-beda-daf7def4af4c]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap9c78e191-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 467327, 'tstamp': 467327}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211983, 'error': None, 'target': 'ovnmeta-9c78e191-f6d6-4fbb-a215-3abc59437ec7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap9c78e191-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 467331, 'tstamp': 467331}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211983, 'error': None, 'target': 'ovnmeta-9c78e191-f6d6-4fbb-a215-3abc59437ec7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:39:38 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:39:38.571 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9c78e191-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:39:38 compute-0 nova_compute[187223]: 2025-11-28 17:39:38.573 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:39:38 compute-0 nova_compute[187223]: 2025-11-28 17:39:38.574 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:39:38 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:39:38.574 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9c78e191-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:39:38 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:39:38.575 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 17:39:38 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:39:38.575 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9c78e191-f0, col_values=(('external_ids', {'iface-id': '6203fe5a-7b88-4f04-8376-3428766cdbb2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:39:38 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:39:38.575 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 17:39:38 compute-0 nova_compute[187223]: 2025-11-28 17:39:38.686 187227 INFO nova.compute.manager [None req-1b9c54dc-f0a7-431d-8b4f-b10b5885ac75 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4103b83a-8bcb-41ce-8044-ae4574ed2c4a] Post operation of migration started
Nov 28 17:39:38 compute-0 nova_compute[187223]: 2025-11-28 17:39:38.934 187227 DEBUG nova.network.neutron [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Updating instance_info_cache with network_info: [{"id": "88207338-3bf0-499d-860e-d43ba0c80385", "address": "fa:16:3e:ab:9f:57", "network": {"id": "9c78e191-f6d6-4fbb-a215-3abc59437ec7", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-222589930-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f8bcfa33baac4402a3841f550dae7748", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88207338-3b", "ovs_interfaceid": "88207338-3bf0-499d-860e-d43ba0c80385", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:39:40 compute-0 nova_compute[187223]: 2025-11-28 17:39:40.071 187227 DEBUG oslo_concurrency.lockutils [None req-1b9c54dc-f0a7-431d-8b4f-b10b5885ac75 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "refresh_cache-4103b83a-8bcb-41ce-8044-ae4574ed2c4a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 17:39:40 compute-0 nova_compute[187223]: 2025-11-28 17:39:40.072 187227 DEBUG oslo_concurrency.lockutils [None req-1b9c54dc-f0a7-431d-8b4f-b10b5885ac75 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquired lock "refresh_cache-4103b83a-8bcb-41ce-8044-ae4574ed2c4a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 17:39:40 compute-0 nova_compute[187223]: 2025-11-28 17:39:40.072 187227 DEBUG nova.network.neutron [None req-1b9c54dc-f0a7-431d-8b4f-b10b5885ac75 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4103b83a-8bcb-41ce-8044-ae4574ed2c4a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 28 17:39:40 compute-0 nova_compute[187223]: 2025-11-28 17:39:40.136 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Releasing lock "refresh_cache-46d07989-a2d7-4ab0-a623-3bf99e2b2b81" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 17:39:40 compute-0 nova_compute[187223]: 2025-11-28 17:39:40.137 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 28 17:39:40 compute-0 nova_compute[187223]: 2025-11-28 17:39:40.137 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:39:40 compute-0 nova_compute[187223]: 2025-11-28 17:39:40.139 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:39:40 compute-0 nova_compute[187223]: 2025-11-28 17:39:40.139 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:39:40 compute-0 podman[211984]: 2025-11-28 17:39:40.214007121 +0000 UTC m=+0.064278000 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, config_id=edpm, managed_by=edpm_ansible, architecture=x86_64, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, distribution-scope=public, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vcs-type=git)
Nov 28 17:39:41 compute-0 nova_compute[187223]: 2025-11-28 17:39:41.402 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:39:42 compute-0 nova_compute[187223]: 2025-11-28 17:39:42.438 187227 DEBUG nova.network.neutron [None req-1b9c54dc-f0a7-431d-8b4f-b10b5885ac75 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4103b83a-8bcb-41ce-8044-ae4574ed2c4a] Updating instance_info_cache with network_info: [{"id": "d191711f-581c-4ac9-8d1f-337f340a2713", "address": "fa:16:3e:f9:14:c7", "network": {"id": "9c78e191-f6d6-4fbb-a215-3abc59437ec7", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-222589930-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f8bcfa33baac4402a3841f550dae7748", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd191711f-58", "ovs_interfaceid": "d191711f-581c-4ac9-8d1f-337f340a2713", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:39:43 compute-0 nova_compute[187223]: 2025-11-28 17:39:43.475 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:39:44 compute-0 nova_compute[187223]: 2025-11-28 17:39:44.452 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:39:45 compute-0 nova_compute[187223]: 2025-11-28 17:39:45.237 187227 DEBUG oslo_concurrency.lockutils [None req-1b9c54dc-f0a7-431d-8b4f-b10b5885ac75 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Releasing lock "refresh_cache-4103b83a-8bcb-41ce-8044-ae4574ed2c4a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 17:39:45 compute-0 nova_compute[187223]: 2025-11-28 17:39:45.279 187227 DEBUG oslo_concurrency.lockutils [None req-1b9c54dc-f0a7-431d-8b4f-b10b5885ac75 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:39:45 compute-0 nova_compute[187223]: 2025-11-28 17:39:45.280 187227 DEBUG oslo_concurrency.lockutils [None req-1b9c54dc-f0a7-431d-8b4f-b10b5885ac75 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:39:45 compute-0 nova_compute[187223]: 2025-11-28 17:39:45.280 187227 DEBUG oslo_concurrency.lockutils [None req-1b9c54dc-f0a7-431d-8b4f-b10b5885ac75 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:39:45 compute-0 nova_compute[187223]: 2025-11-28 17:39:45.286 187227 INFO nova.virt.libvirt.driver [None req-1b9c54dc-f0a7-431d-8b4f-b10b5885ac75 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4103b83a-8bcb-41ce-8044-ae4574ed2c4a] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Nov 28 17:39:45 compute-0 virtqemud[186845]: Domain id=8 name='instance-00000009' uuid=4103b83a-8bcb-41ce-8044-ae4574ed2c4a is tainted: custom-monitor
Nov 28 17:39:46 compute-0 nova_compute[187223]: 2025-11-28 17:39:46.295 187227 INFO nova.virt.libvirt.driver [None req-1b9c54dc-f0a7-431d-8b4f-b10b5885ac75 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4103b83a-8bcb-41ce-8044-ae4574ed2c4a] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Nov 28 17:39:46 compute-0 nova_compute[187223]: 2025-11-28 17:39:46.406 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:39:47 compute-0 nova_compute[187223]: 2025-11-28 17:39:47.301 187227 INFO nova.virt.libvirt.driver [None req-1b9c54dc-f0a7-431d-8b4f-b10b5885ac75 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4103b83a-8bcb-41ce-8044-ae4574ed2c4a] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Nov 28 17:39:47 compute-0 nova_compute[187223]: 2025-11-28 17:39:47.309 187227 DEBUG nova.compute.manager [None req-1b9c54dc-f0a7-431d-8b4f-b10b5885ac75 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4103b83a-8bcb-41ce-8044-ae4574ed2c4a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:39:48 compute-0 nova_compute[187223]: 2025-11-28 17:39:48.478 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:39:48 compute-0 nova_compute[187223]: 2025-11-28 17:39:48.501 187227 DEBUG nova.objects.instance [None req-1b9c54dc-f0a7-431d-8b4f-b10b5885ac75 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4103b83a-8bcb-41ce-8044-ae4574ed2c4a] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Nov 28 17:39:51 compute-0 nova_compute[187223]: 2025-11-28 17:39:51.450 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:39:53 compute-0 nova_compute[187223]: 2025-11-28 17:39:53.480 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:39:54 compute-0 nova_compute[187223]: 2025-11-28 17:39:54.880 187227 DEBUG oslo_concurrency.lockutils [None req-f2c4187f-c7bd-4ef8-b7a5-248bf77637d2 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Acquiring lock "46d07989-a2d7-4ab0-a623-3bf99e2b2b81" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:39:54 compute-0 nova_compute[187223]: 2025-11-28 17:39:54.880 187227 DEBUG oslo_concurrency.lockutils [None req-f2c4187f-c7bd-4ef8-b7a5-248bf77637d2 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Lock "46d07989-a2d7-4ab0-a623-3bf99e2b2b81" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:39:54 compute-0 nova_compute[187223]: 2025-11-28 17:39:54.881 187227 DEBUG oslo_concurrency.lockutils [None req-f2c4187f-c7bd-4ef8-b7a5-248bf77637d2 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Acquiring lock "46d07989-a2d7-4ab0-a623-3bf99e2b2b81-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:39:54 compute-0 nova_compute[187223]: 2025-11-28 17:39:54.882 187227 DEBUG oslo_concurrency.lockutils [None req-f2c4187f-c7bd-4ef8-b7a5-248bf77637d2 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Lock "46d07989-a2d7-4ab0-a623-3bf99e2b2b81-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:39:54 compute-0 nova_compute[187223]: 2025-11-28 17:39:54.882 187227 DEBUG oslo_concurrency.lockutils [None req-f2c4187f-c7bd-4ef8-b7a5-248bf77637d2 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Lock "46d07989-a2d7-4ab0-a623-3bf99e2b2b81-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:39:54 compute-0 nova_compute[187223]: 2025-11-28 17:39:54.883 187227 INFO nova.compute.manager [None req-f2c4187f-c7bd-4ef8-b7a5-248bf77637d2 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Terminating instance
Nov 28 17:39:54 compute-0 nova_compute[187223]: 2025-11-28 17:39:54.884 187227 DEBUG nova.compute.manager [None req-f2c4187f-c7bd-4ef8-b7a5-248bf77637d2 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 28 17:39:54 compute-0 kernel: tap88207338-3b (unregistering): left promiscuous mode
Nov 28 17:39:54 compute-0 NetworkManager[55763]: <info>  [1764351594.9078] device (tap88207338-3b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 28 17:39:54 compute-0 nova_compute[187223]: 2025-11-28 17:39:54.919 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:39:54 compute-0 ovn_controller[95574]: 2025-11-28T17:39:54Z|00090|binding|INFO|Releasing lport 88207338-3bf0-499d-860e-d43ba0c80385 from this chassis (sb_readonly=0)
Nov 28 17:39:54 compute-0 ovn_controller[95574]: 2025-11-28T17:39:54Z|00091|binding|INFO|Setting lport 88207338-3bf0-499d-860e-d43ba0c80385 down in Southbound
Nov 28 17:39:54 compute-0 ovn_controller[95574]: 2025-11-28T17:39:54Z|00092|binding|INFO|Removing iface tap88207338-3b ovn-installed in OVS
Nov 28 17:39:54 compute-0 nova_compute[187223]: 2025-11-28 17:39:54.926 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:39:54 compute-0 nova_compute[187223]: 2025-11-28 17:39:54.934 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:39:54 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:39:54.936 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ab:9f:57 10.100.0.5'], port_security=['fa:16:3e:ab:9f:57 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '46d07989-a2d7-4ab0-a623-3bf99e2b2b81', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9c78e191-f6d6-4fbb-a215-3abc59437ec7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f8bcfa33baac4402a3841f550dae7748', 'neutron:revision_number': '4', 'neutron:security_group_ids': '4075dbba-04aa-4b93-a4f6-13da6a2d4e2b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7a780d48-7811-4465-9b47-456ebf0c9522, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], logical_port=88207338-3bf0-499d-860e-d43ba0c80385) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 17:39:54 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:39:54.939 104433 INFO neutron.agent.ovn.metadata.agent [-] Port 88207338-3bf0-499d-860e-d43ba0c80385 in datapath 9c78e191-f6d6-4fbb-a215-3abc59437ec7 unbound from our chassis
Nov 28 17:39:54 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:39:54.941 104433 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9c78e191-f6d6-4fbb-a215-3abc59437ec7
Nov 28 17:39:54 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d0000000a.scope: Deactivated successfully.
Nov 28 17:39:54 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d0000000a.scope: Consumed 17.161s CPU time.
Nov 28 17:39:54 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:39:54.958 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[4d9b7f2c-d49d-4642-9248-1bc7f55ebb3a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:39:54 compute-0 systemd-machined[153517]: Machine qemu-7-instance-0000000a terminated.
Nov 28 17:39:54 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:39:54.997 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[4d705fe4-bf6b-4e87-b095-e487ac58f8b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:39:55 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:39:55.001 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[c3b6e78e-8a80-464d-be31-97a7692163c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:39:55 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:39:55.028 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[c1162d78-de18-4ccb-ae8f-542bd1765c1f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:39:55 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:39:55.051 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[5c29e292-3a16-4b0a-b690-1dcc76771f92]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9c78e191-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:88:f4:77'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 467313, 'reachable_time': 31539, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212020, 'error': None, 'target': 'ovnmeta-9c78e191-f6d6-4fbb-a215-3abc59437ec7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:39:55 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:39:55.071 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[c88f5a6d-6e0f-4373-9bef-e801b425f6bf]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap9c78e191-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 467327, 'tstamp': 467327}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212021, 'error': None, 'target': 'ovnmeta-9c78e191-f6d6-4fbb-a215-3abc59437ec7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap9c78e191-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 467331, 'tstamp': 467331}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212021, 'error': None, 'target': 'ovnmeta-9c78e191-f6d6-4fbb-a215-3abc59437ec7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:39:55 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:39:55.074 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9c78e191-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:39:55 compute-0 nova_compute[187223]: 2025-11-28 17:39:55.077 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:39:55 compute-0 nova_compute[187223]: 2025-11-28 17:39:55.082 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:39:55 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:39:55.082 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9c78e191-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:39:55 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:39:55.083 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 17:39:55 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:39:55.083 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9c78e191-f0, col_values=(('external_ids', {'iface-id': '6203fe5a-7b88-4f04-8376-3428766cdbb2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:39:55 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:39:55.084 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 17:39:55 compute-0 nova_compute[187223]: 2025-11-28 17:39:55.144 187227 INFO nova.virt.libvirt.driver [-] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Instance destroyed successfully.
Nov 28 17:39:55 compute-0 nova_compute[187223]: 2025-11-28 17:39:55.145 187227 DEBUG nova.objects.instance [None req-f2c4187f-c7bd-4ef8-b7a5-248bf77637d2 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Lazy-loading 'resources' on Instance uuid 46d07989-a2d7-4ab0-a623-3bf99e2b2b81 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:39:56 compute-0 podman[212040]: 2025-11-28 17:39:56.216122088 +0000 UTC m=+0.069016397 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 17:39:56 compute-0 nova_compute[187223]: 2025-11-28 17:39:56.453 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:39:56 compute-0 nova_compute[187223]: 2025-11-28 17:39:56.944 187227 DEBUG nova.virt.libvirt.vif [None req-f2c4187f-c7bd-4ef8-b7a5-248bf77637d2 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T17:38:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1906783340',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1906783340',id=10,image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-28T17:38:19Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f8bcfa33baac4402a3841f550dae7748',ramdisk_id='',reservation_id='r-ykq8pcdm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min
_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1463017589',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1463017589-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-28T17:38:19Z,user_data=None,user_id='e6ed8cc17a7c4f34b32582c250e4b754',uuid=46d07989-a2d7-4ab0-a623-3bf99e2b2b81,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "88207338-3bf0-499d-860e-d43ba0c80385", "address": "fa:16:3e:ab:9f:57", "network": {"id": "9c78e191-f6d6-4fbb-a215-3abc59437ec7", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-222589930-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f8bcfa33baac4402a3841f550dae7748", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88207338-3b", "ovs_interfaceid": "88207338-3bf0-499d-860e-d43ba0c80385", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 28 17:39:56 compute-0 nova_compute[187223]: 2025-11-28 17:39:56.945 187227 DEBUG nova.network.os_vif_util [None req-f2c4187f-c7bd-4ef8-b7a5-248bf77637d2 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Converting VIF {"id": "88207338-3bf0-499d-860e-d43ba0c80385", "address": "fa:16:3e:ab:9f:57", "network": {"id": "9c78e191-f6d6-4fbb-a215-3abc59437ec7", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-222589930-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f8bcfa33baac4402a3841f550dae7748", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88207338-3b", "ovs_interfaceid": "88207338-3bf0-499d-860e-d43ba0c80385", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 17:39:56 compute-0 nova_compute[187223]: 2025-11-28 17:39:56.945 187227 DEBUG nova.network.os_vif_util [None req-f2c4187f-c7bd-4ef8-b7a5-248bf77637d2 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ab:9f:57,bridge_name='br-int',has_traffic_filtering=True,id=88207338-3bf0-499d-860e-d43ba0c80385,network=Network(9c78e191-f6d6-4fbb-a215-3abc59437ec7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap88207338-3b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 17:39:56 compute-0 nova_compute[187223]: 2025-11-28 17:39:56.946 187227 DEBUG os_vif [None req-f2c4187f-c7bd-4ef8-b7a5-248bf77637d2 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ab:9f:57,bridge_name='br-int',has_traffic_filtering=True,id=88207338-3bf0-499d-860e-d43ba0c80385,network=Network(9c78e191-f6d6-4fbb-a215-3abc59437ec7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap88207338-3b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 28 17:39:56 compute-0 nova_compute[187223]: 2025-11-28 17:39:56.948 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:39:56 compute-0 nova_compute[187223]: 2025-11-28 17:39:56.949 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap88207338-3b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:39:56 compute-0 nova_compute[187223]: 2025-11-28 17:39:56.950 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:39:56 compute-0 nova_compute[187223]: 2025-11-28 17:39:56.952 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:39:56 compute-0 nova_compute[187223]: 2025-11-28 17:39:56.955 187227 INFO os_vif [None req-f2c4187f-c7bd-4ef8-b7a5-248bf77637d2 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ab:9f:57,bridge_name='br-int',has_traffic_filtering=True,id=88207338-3bf0-499d-860e-d43ba0c80385,network=Network(9c78e191-f6d6-4fbb-a215-3abc59437ec7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap88207338-3b')
Nov 28 17:39:56 compute-0 nova_compute[187223]: 2025-11-28 17:39:56.956 187227 INFO nova.virt.libvirt.driver [None req-f2c4187f-c7bd-4ef8-b7a5-248bf77637d2 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Deleting instance files /var/lib/nova/instances/46d07989-a2d7-4ab0-a623-3bf99e2b2b81_del
Nov 28 17:39:56 compute-0 nova_compute[187223]: 2025-11-28 17:39:56.957 187227 INFO nova.virt.libvirt.driver [None req-f2c4187f-c7bd-4ef8-b7a5-248bf77637d2 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Deletion of /var/lib/nova/instances/46d07989-a2d7-4ab0-a623-3bf99e2b2b81_del complete
Nov 28 17:39:57 compute-0 nova_compute[187223]: 2025-11-28 17:39:57.025 187227 INFO nova.compute.manager [None req-f2c4187f-c7bd-4ef8-b7a5-248bf77637d2 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Took 2.14 seconds to destroy the instance on the hypervisor.
Nov 28 17:39:57 compute-0 nova_compute[187223]: 2025-11-28 17:39:57.026 187227 DEBUG oslo.service.loopingcall [None req-f2c4187f-c7bd-4ef8-b7a5-248bf77637d2 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 28 17:39:57 compute-0 nova_compute[187223]: 2025-11-28 17:39:57.026 187227 DEBUG nova.compute.manager [-] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 28 17:39:57 compute-0 nova_compute[187223]: 2025-11-28 17:39:57.027 187227 DEBUG nova.network.neutron [-] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 28 17:39:57 compute-0 nova_compute[187223]: 2025-11-28 17:39:57.164 187227 DEBUG nova.compute.manager [req-ec0e1d04-b01a-4570-8d68-0275e1515641 req-9a084b17-755e-4f4b-ac19-a745fb1f8751 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Received event network-vif-unplugged-88207338-3bf0-499d-860e-d43ba0c80385 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:39:57 compute-0 nova_compute[187223]: 2025-11-28 17:39:57.164 187227 DEBUG oslo_concurrency.lockutils [req-ec0e1d04-b01a-4570-8d68-0275e1515641 req-9a084b17-755e-4f4b-ac19-a745fb1f8751 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "46d07989-a2d7-4ab0-a623-3bf99e2b2b81-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:39:57 compute-0 nova_compute[187223]: 2025-11-28 17:39:57.165 187227 DEBUG oslo_concurrency.lockutils [req-ec0e1d04-b01a-4570-8d68-0275e1515641 req-9a084b17-755e-4f4b-ac19-a745fb1f8751 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "46d07989-a2d7-4ab0-a623-3bf99e2b2b81-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:39:57 compute-0 nova_compute[187223]: 2025-11-28 17:39:57.165 187227 DEBUG oslo_concurrency.lockutils [req-ec0e1d04-b01a-4570-8d68-0275e1515641 req-9a084b17-755e-4f4b-ac19-a745fb1f8751 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "46d07989-a2d7-4ab0-a623-3bf99e2b2b81-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:39:57 compute-0 nova_compute[187223]: 2025-11-28 17:39:57.165 187227 DEBUG nova.compute.manager [req-ec0e1d04-b01a-4570-8d68-0275e1515641 req-9a084b17-755e-4f4b-ac19-a745fb1f8751 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] No waiting events found dispatching network-vif-unplugged-88207338-3bf0-499d-860e-d43ba0c80385 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:39:57 compute-0 nova_compute[187223]: 2025-11-28 17:39:57.166 187227 DEBUG nova.compute.manager [req-ec0e1d04-b01a-4570-8d68-0275e1515641 req-9a084b17-755e-4f4b-ac19-a745fb1f8751 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Received event network-vif-unplugged-88207338-3bf0-499d-860e-d43ba0c80385 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 28 17:39:58 compute-0 nova_compute[187223]: 2025-11-28 17:39:58.483 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:39:59 compute-0 podman[197556]: time="2025-11-28T17:39:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:39:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:39:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18323 "" "Go-http-client/1.1"
Nov 28 17:39:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:39:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3053 "" "Go-http-client/1.1"
Nov 28 17:40:01 compute-0 nova_compute[187223]: 2025-11-28 17:40:01.022 187227 DEBUG nova.network.neutron [-] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:40:01 compute-0 nova_compute[187223]: 2025-11-28 17:40:01.064 187227 DEBUG nova.compute.manager [req-c385fef9-ab73-46c4-90c1-dc9b1405b786 req-d2e875ac-7a25-4e5d-881d-2f66fb84ddc7 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Received event network-vif-plugged-88207338-3bf0-499d-860e-d43ba0c80385 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:40:01 compute-0 nova_compute[187223]: 2025-11-28 17:40:01.064 187227 DEBUG oslo_concurrency.lockutils [req-c385fef9-ab73-46c4-90c1-dc9b1405b786 req-d2e875ac-7a25-4e5d-881d-2f66fb84ddc7 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "46d07989-a2d7-4ab0-a623-3bf99e2b2b81-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:40:01 compute-0 nova_compute[187223]: 2025-11-28 17:40:01.065 187227 DEBUG oslo_concurrency.lockutils [req-c385fef9-ab73-46c4-90c1-dc9b1405b786 req-d2e875ac-7a25-4e5d-881d-2f66fb84ddc7 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "46d07989-a2d7-4ab0-a623-3bf99e2b2b81-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:40:01 compute-0 nova_compute[187223]: 2025-11-28 17:40:01.065 187227 DEBUG oslo_concurrency.lockutils [req-c385fef9-ab73-46c4-90c1-dc9b1405b786 req-d2e875ac-7a25-4e5d-881d-2f66fb84ddc7 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "46d07989-a2d7-4ab0-a623-3bf99e2b2b81-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:40:01 compute-0 nova_compute[187223]: 2025-11-28 17:40:01.066 187227 DEBUG nova.compute.manager [req-c385fef9-ab73-46c4-90c1-dc9b1405b786 req-d2e875ac-7a25-4e5d-881d-2f66fb84ddc7 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] No waiting events found dispatching network-vif-plugged-88207338-3bf0-499d-860e-d43ba0c80385 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:40:01 compute-0 nova_compute[187223]: 2025-11-28 17:40:01.066 187227 WARNING nova.compute.manager [req-c385fef9-ab73-46c4-90c1-dc9b1405b786 req-d2e875ac-7a25-4e5d-881d-2f66fb84ddc7 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Received unexpected event network-vif-plugged-88207338-3bf0-499d-860e-d43ba0c80385 for instance with vm_state active and task_state deleting.
Nov 28 17:40:01 compute-0 nova_compute[187223]: 2025-11-28 17:40:01.066 187227 DEBUG nova.compute.manager [req-c385fef9-ab73-46c4-90c1-dc9b1405b786 req-d2e875ac-7a25-4e5d-881d-2f66fb84ddc7 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Received event network-vif-deleted-88207338-3bf0-499d-860e-d43ba0c80385 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:40:01 compute-0 nova_compute[187223]: 2025-11-28 17:40:01.066 187227 INFO nova.compute.manager [req-c385fef9-ab73-46c4-90c1-dc9b1405b786 req-d2e875ac-7a25-4e5d-881d-2f66fb84ddc7 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Neutron deleted interface 88207338-3bf0-499d-860e-d43ba0c80385; detaching it from the instance and deleting it from the info cache
Nov 28 17:40:01 compute-0 nova_compute[187223]: 2025-11-28 17:40:01.067 187227 DEBUG nova.network.neutron [req-c385fef9-ab73-46c4-90c1-dc9b1405b786 req-d2e875ac-7a25-4e5d-881d-2f66fb84ddc7 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:40:01 compute-0 nova_compute[187223]: 2025-11-28 17:40:01.069 187227 INFO nova.compute.manager [-] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Took 4.04 seconds to deallocate network for instance.
Nov 28 17:40:01 compute-0 nova_compute[187223]: 2025-11-28 17:40:01.103 187227 DEBUG nova.compute.manager [req-c385fef9-ab73-46c4-90c1-dc9b1405b786 req-d2e875ac-7a25-4e5d-881d-2f66fb84ddc7 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Detach interface failed, port_id=88207338-3bf0-499d-860e-d43ba0c80385, reason: Instance 46d07989-a2d7-4ab0-a623-3bf99e2b2b81 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Nov 28 17:40:01 compute-0 nova_compute[187223]: 2025-11-28 17:40:01.131 187227 DEBUG oslo_concurrency.lockutils [None req-f2c4187f-c7bd-4ef8-b7a5-248bf77637d2 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:40:01 compute-0 nova_compute[187223]: 2025-11-28 17:40:01.132 187227 DEBUG oslo_concurrency.lockutils [None req-f2c4187f-c7bd-4ef8-b7a5-248bf77637d2 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:40:01 compute-0 nova_compute[187223]: 2025-11-28 17:40:01.213 187227 DEBUG nova.compute.provider_tree [None req-f2c4187f-c7bd-4ef8-b7a5-248bf77637d2 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 17:40:01 compute-0 nova_compute[187223]: 2025-11-28 17:40:01.234 187227 DEBUG nova.scheduler.client.report [None req-f2c4187f-c7bd-4ef8-b7a5-248bf77637d2 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 17:40:01 compute-0 nova_compute[187223]: 2025-11-28 17:40:01.269 187227 DEBUG oslo_concurrency.lockutils [None req-f2c4187f-c7bd-4ef8-b7a5-248bf77637d2 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.137s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:40:01 compute-0 nova_compute[187223]: 2025-11-28 17:40:01.297 187227 INFO nova.scheduler.client.report [None req-f2c4187f-c7bd-4ef8-b7a5-248bf77637d2 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Deleted allocations for instance 46d07989-a2d7-4ab0-a623-3bf99e2b2b81
Nov 28 17:40:01 compute-0 nova_compute[187223]: 2025-11-28 17:40:01.399 187227 DEBUG oslo_concurrency.lockutils [None req-f2c4187f-c7bd-4ef8-b7a5-248bf77637d2 e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Lock "46d07989-a2d7-4ab0-a623-3bf99e2b2b81" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.519s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:40:01 compute-0 openstack_network_exporter[199717]: ERROR   17:40:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:40:01 compute-0 openstack_network_exporter[199717]: ERROR   17:40:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:40:01 compute-0 openstack_network_exporter[199717]: ERROR   17:40:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:40:01 compute-0 openstack_network_exporter[199717]: ERROR   17:40:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:40:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:40:01 compute-0 openstack_network_exporter[199717]: ERROR   17:40:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:40:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:40:01 compute-0 nova_compute[187223]: 2025-11-28 17:40:01.952 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:40:02 compute-0 nova_compute[187223]: 2025-11-28 17:40:02.448 187227 DEBUG oslo_concurrency.lockutils [None req-2fa97fdd-6185-48f7-acbb-8db6f32043da e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Acquiring lock "4103b83a-8bcb-41ce-8044-ae4574ed2c4a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:40:02 compute-0 nova_compute[187223]: 2025-11-28 17:40:02.449 187227 DEBUG oslo_concurrency.lockutils [None req-2fa97fdd-6185-48f7-acbb-8db6f32043da e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Lock "4103b83a-8bcb-41ce-8044-ae4574ed2c4a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:40:02 compute-0 nova_compute[187223]: 2025-11-28 17:40:02.449 187227 DEBUG oslo_concurrency.lockutils [None req-2fa97fdd-6185-48f7-acbb-8db6f32043da e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Acquiring lock "4103b83a-8bcb-41ce-8044-ae4574ed2c4a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:40:02 compute-0 nova_compute[187223]: 2025-11-28 17:40:02.449 187227 DEBUG oslo_concurrency.lockutils [None req-2fa97fdd-6185-48f7-acbb-8db6f32043da e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Lock "4103b83a-8bcb-41ce-8044-ae4574ed2c4a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:40:02 compute-0 nova_compute[187223]: 2025-11-28 17:40:02.450 187227 DEBUG oslo_concurrency.lockutils [None req-2fa97fdd-6185-48f7-acbb-8db6f32043da e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Lock "4103b83a-8bcb-41ce-8044-ae4574ed2c4a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:40:02 compute-0 nova_compute[187223]: 2025-11-28 17:40:02.451 187227 INFO nova.compute.manager [None req-2fa97fdd-6185-48f7-acbb-8db6f32043da e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] [instance: 4103b83a-8bcb-41ce-8044-ae4574ed2c4a] Terminating instance
Nov 28 17:40:02 compute-0 nova_compute[187223]: 2025-11-28 17:40:02.451 187227 DEBUG nova.compute.manager [None req-2fa97fdd-6185-48f7-acbb-8db6f32043da e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] [instance: 4103b83a-8bcb-41ce-8044-ae4574ed2c4a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 28 17:40:02 compute-0 kernel: tapd191711f-58 (unregistering): left promiscuous mode
Nov 28 17:40:02 compute-0 NetworkManager[55763]: <info>  [1764351602.4771] device (tapd191711f-58): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 28 17:40:02 compute-0 nova_compute[187223]: 2025-11-28 17:40:02.489 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:40:02 compute-0 ovn_controller[95574]: 2025-11-28T17:40:02Z|00093|binding|INFO|Releasing lport d191711f-581c-4ac9-8d1f-337f340a2713 from this chassis (sb_readonly=0)
Nov 28 17:40:02 compute-0 ovn_controller[95574]: 2025-11-28T17:40:02Z|00094|binding|INFO|Setting lport d191711f-581c-4ac9-8d1f-337f340a2713 down in Southbound
Nov 28 17:40:02 compute-0 ovn_controller[95574]: 2025-11-28T17:40:02Z|00095|binding|INFO|Removing iface tapd191711f-58 ovn-installed in OVS
Nov 28 17:40:02 compute-0 nova_compute[187223]: 2025-11-28 17:40:02.492 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:40:02 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:40:02.502 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f9:14:c7 10.100.0.9'], port_security=['fa:16:3e:f9:14:c7 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '4103b83a-8bcb-41ce-8044-ae4574ed2c4a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9c78e191-f6d6-4fbb-a215-3abc59437ec7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f8bcfa33baac4402a3841f550dae7748', 'neutron:revision_number': '13', 'neutron:security_group_ids': '4075dbba-04aa-4b93-a4f6-13da6a2d4e2b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7a780d48-7811-4465-9b47-456ebf0c9522, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], logical_port=d191711f-581c-4ac9-8d1f-337f340a2713) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 17:40:02 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:40:02.504 104433 INFO neutron.agent.ovn.metadata.agent [-] Port d191711f-581c-4ac9-8d1f-337f340a2713 in datapath 9c78e191-f6d6-4fbb-a215-3abc59437ec7 unbound from our chassis
Nov 28 17:40:02 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:40:02.505 104433 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9c78e191-f6d6-4fbb-a215-3abc59437ec7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 17:40:02 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:40:02.507 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[c25a9cf0-1fe8-4a2a-b38b-31eb316c1162]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:40:02 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:40:02.507 104433 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9c78e191-f6d6-4fbb-a215-3abc59437ec7 namespace which is not needed anymore
Nov 28 17:40:02 compute-0 nova_compute[187223]: 2025-11-28 17:40:02.515 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:40:02 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000009.scope: Deactivated successfully.
Nov 28 17:40:02 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000009.scope: Consumed 3.584s CPU time.
Nov 28 17:40:02 compute-0 systemd-machined[153517]: Machine qemu-8-instance-00000009 terminated.
Nov 28 17:40:02 compute-0 podman[212064]: 2025-11-28 17:40:02.572996382 +0000 UTC m=+0.085930338 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3)
Nov 28 17:40:02 compute-0 neutron-haproxy-ovnmeta-9c78e191-f6d6-4fbb-a215-3abc59437ec7[211529]: [NOTICE]   (211533) : haproxy version is 2.8.14-c23fe91
Nov 28 17:40:02 compute-0 neutron-haproxy-ovnmeta-9c78e191-f6d6-4fbb-a215-3abc59437ec7[211529]: [NOTICE]   (211533) : path to executable is /usr/sbin/haproxy
Nov 28 17:40:02 compute-0 neutron-haproxy-ovnmeta-9c78e191-f6d6-4fbb-a215-3abc59437ec7[211529]: [WARNING]  (211533) : Exiting Master process...
Nov 28 17:40:02 compute-0 neutron-haproxy-ovnmeta-9c78e191-f6d6-4fbb-a215-3abc59437ec7[211529]: [ALERT]    (211533) : Current worker (211535) exited with code 143 (Terminated)
Nov 28 17:40:02 compute-0 neutron-haproxy-ovnmeta-9c78e191-f6d6-4fbb-a215-3abc59437ec7[211529]: [WARNING]  (211533) : All workers exited. Exiting... (0)
Nov 28 17:40:02 compute-0 systemd[1]: libpod-4a8ad07718805aefa8e0015bf567ea1f54a4d968d2b8ccc9bbe63b483e212db1.scope: Deactivated successfully.
Nov 28 17:40:02 compute-0 podman[212108]: 2025-11-28 17:40:02.660730253 +0000 UTC m=+0.052472367 container died 4a8ad07718805aefa8e0015bf567ea1f54a4d968d2b8ccc9bbe63b483e212db1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9c78e191-f6d6-4fbb-a215-3abc59437ec7, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 17:40:02 compute-0 kernel: tapd191711f-58: entered promiscuous mode
Nov 28 17:40:02 compute-0 ovn_controller[95574]: 2025-11-28T17:40:02Z|00096|binding|INFO|Claiming lport d191711f-581c-4ac9-8d1f-337f340a2713 for this chassis.
Nov 28 17:40:02 compute-0 nova_compute[187223]: 2025-11-28 17:40:02.718 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:40:02 compute-0 ovn_controller[95574]: 2025-11-28T17:40:02Z|00097|binding|INFO|d191711f-581c-4ac9-8d1f-337f340a2713: Claiming fa:16:3e:f9:14:c7 10.100.0.9
Nov 28 17:40:02 compute-0 kernel: tapd191711f-58 (unregistering): left promiscuous mode
Nov 28 17:40:02 compute-0 NetworkManager[55763]: <info>  [1764351602.7237] manager: (tapd191711f-58): new Tun device (/org/freedesktop/NetworkManager/Devices/43)
Nov 28 17:40:02 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4a8ad07718805aefa8e0015bf567ea1f54a4d968d2b8ccc9bbe63b483e212db1-userdata-shm.mount: Deactivated successfully.
Nov 28 17:40:02 compute-0 systemd[1]: var-lib-containers-storage-overlay-20aed70a7077f0844d7b9f6fd136952000a5310a90befda91dad30647b5b309f-merged.mount: Deactivated successfully.
Nov 28 17:40:02 compute-0 ovn_controller[95574]: 2025-11-28T17:40:02Z|00098|binding|INFO|Setting lport d191711f-581c-4ac9-8d1f-337f340a2713 ovn-installed in OVS
Nov 28 17:40:02 compute-0 ovn_controller[95574]: 2025-11-28T17:40:02Z|00099|if_status|INFO|Not setting lport d191711f-581c-4ac9-8d1f-337f340a2713 down as sb is readonly
Nov 28 17:40:02 compute-0 podman[212108]: 2025-11-28 17:40:02.748071381 +0000 UTC m=+0.139813485 container cleanup 4a8ad07718805aefa8e0015bf567ea1f54a4d968d2b8ccc9bbe63b483e212db1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9c78e191-f6d6-4fbb-a215-3abc59437ec7, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 28 17:40:02 compute-0 nova_compute[187223]: 2025-11-28 17:40:02.745 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:40:02 compute-0 systemd[1]: libpod-conmon-4a8ad07718805aefa8e0015bf567ea1f54a4d968d2b8ccc9bbe63b483e212db1.scope: Deactivated successfully.
Nov 28 17:40:02 compute-0 ovn_controller[95574]: 2025-11-28T17:40:02Z|00100|binding|INFO|Releasing lport d191711f-581c-4ac9-8d1f-337f340a2713 from this chassis (sb_readonly=0)
Nov 28 17:40:02 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:40:02.771 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f9:14:c7 10.100.0.9'], port_security=['fa:16:3e:f9:14:c7 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '4103b83a-8bcb-41ce-8044-ae4574ed2c4a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9c78e191-f6d6-4fbb-a215-3abc59437ec7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f8bcfa33baac4402a3841f550dae7748', 'neutron:revision_number': '13', 'neutron:security_group_ids': '4075dbba-04aa-4b93-a4f6-13da6a2d4e2b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7a780d48-7811-4465-9b47-456ebf0c9522, chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], logical_port=d191711f-581c-4ac9-8d1f-337f340a2713) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 17:40:02 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:40:02.776 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f9:14:c7 10.100.0.9'], port_security=['fa:16:3e:f9:14:c7 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '4103b83a-8bcb-41ce-8044-ae4574ed2c4a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9c78e191-f6d6-4fbb-a215-3abc59437ec7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f8bcfa33baac4402a3841f550dae7748', 'neutron:revision_number': '13', 'neutron:security_group_ids': '4075dbba-04aa-4b93-a4f6-13da6a2d4e2b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7a780d48-7811-4465-9b47-456ebf0c9522, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], logical_port=d191711f-581c-4ac9-8d1f-337f340a2713) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 17:40:02 compute-0 nova_compute[187223]: 2025-11-28 17:40:02.785 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:40:02 compute-0 nova_compute[187223]: 2025-11-28 17:40:02.789 187227 INFO nova.virt.libvirt.driver [-] [instance: 4103b83a-8bcb-41ce-8044-ae4574ed2c4a] Instance destroyed successfully.
Nov 28 17:40:02 compute-0 nova_compute[187223]: 2025-11-28 17:40:02.790 187227 DEBUG nova.objects.instance [None req-2fa97fdd-6185-48f7-acbb-8db6f32043da e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Lazy-loading 'resources' on Instance uuid 4103b83a-8bcb-41ce-8044-ae4574ed2c4a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:40:02 compute-0 nova_compute[187223]: 2025-11-28 17:40:02.812 187227 DEBUG nova.virt.libvirt.vif [None req-2fa97fdd-6185-48f7-acbb-8db6f32043da e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-28T17:37:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-893617996',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-893617996',id=9,image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-28T17:38:01Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f8bcfa33baac4402a3841f550dae7748',ramdisk_id='',reservation_id='r-vyilu9si',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1463017589',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1463017589-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-28T17:39:48Z,user_data=None,user_id='e6ed8cc17a7c4f34b32582c250e4b754',uuid=4103b83a-8bcb-41ce-8044-ae4574ed2c4a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d191711f-581c-4ac9-8d1f-337f340a2713", "address": "fa:16:3e:f9:14:c7", "network": {"id": "9c78e191-f6d6-4fbb-a215-3abc59437ec7", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-222589930-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f8bcfa33baac4402a3841f550dae7748", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd191711f-58", "ovs_interfaceid": "d191711f-581c-4ac9-8d1f-337f340a2713", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 28 17:40:02 compute-0 nova_compute[187223]: 2025-11-28 17:40:02.813 187227 DEBUG nova.network.os_vif_util [None req-2fa97fdd-6185-48f7-acbb-8db6f32043da e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Converting VIF {"id": "d191711f-581c-4ac9-8d1f-337f340a2713", "address": "fa:16:3e:f9:14:c7", "network": {"id": "9c78e191-f6d6-4fbb-a215-3abc59437ec7", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-222589930-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f8bcfa33baac4402a3841f550dae7748", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd191711f-58", "ovs_interfaceid": "d191711f-581c-4ac9-8d1f-337f340a2713", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 17:40:02 compute-0 nova_compute[187223]: 2025-11-28 17:40:02.813 187227 DEBUG nova.network.os_vif_util [None req-2fa97fdd-6185-48f7-acbb-8db6f32043da e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f9:14:c7,bridge_name='br-int',has_traffic_filtering=True,id=d191711f-581c-4ac9-8d1f-337f340a2713,network=Network(9c78e191-f6d6-4fbb-a215-3abc59437ec7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd191711f-58') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 17:40:02 compute-0 nova_compute[187223]: 2025-11-28 17:40:02.814 187227 DEBUG os_vif [None req-2fa97fdd-6185-48f7-acbb-8db6f32043da e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f9:14:c7,bridge_name='br-int',has_traffic_filtering=True,id=d191711f-581c-4ac9-8d1f-337f340a2713,network=Network(9c78e191-f6d6-4fbb-a215-3abc59437ec7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd191711f-58') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 28 17:40:02 compute-0 nova_compute[187223]: 2025-11-28 17:40:02.817 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:40:02 compute-0 nova_compute[187223]: 2025-11-28 17:40:02.817 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd191711f-58, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:40:02 compute-0 nova_compute[187223]: 2025-11-28 17:40:02.819 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:40:02 compute-0 nova_compute[187223]: 2025-11-28 17:40:02.821 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 17:40:02 compute-0 nova_compute[187223]: 2025-11-28 17:40:02.823 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:40:02 compute-0 nova_compute[187223]: 2025-11-28 17:40:02.825 187227 INFO os_vif [None req-2fa97fdd-6185-48f7-acbb-8db6f32043da e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f9:14:c7,bridge_name='br-int',has_traffic_filtering=True,id=d191711f-581c-4ac9-8d1f-337f340a2713,network=Network(9c78e191-f6d6-4fbb-a215-3abc59437ec7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd191711f-58')
Nov 28 17:40:02 compute-0 nova_compute[187223]: 2025-11-28 17:40:02.827 187227 INFO nova.virt.libvirt.driver [None req-2fa97fdd-6185-48f7-acbb-8db6f32043da e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] [instance: 4103b83a-8bcb-41ce-8044-ae4574ed2c4a] Deleting instance files /var/lib/nova/instances/4103b83a-8bcb-41ce-8044-ae4574ed2c4a_del
Nov 28 17:40:02 compute-0 nova_compute[187223]: 2025-11-28 17:40:02.827 187227 INFO nova.virt.libvirt.driver [None req-2fa97fdd-6185-48f7-acbb-8db6f32043da e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] [instance: 4103b83a-8bcb-41ce-8044-ae4574ed2c4a] Deletion of /var/lib/nova/instances/4103b83a-8bcb-41ce-8044-ae4574ed2c4a_del complete
Nov 28 17:40:02 compute-0 podman[212146]: 2025-11-28 17:40:02.83332697 +0000 UTC m=+0.058174182 container remove 4a8ad07718805aefa8e0015bf567ea1f54a4d968d2b8ccc9bbe63b483e212db1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9c78e191-f6d6-4fbb-a215-3abc59437ec7, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 28 17:40:02 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:40:02.839 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[0e6eea37-4cba-44dc-8ddf-7ed13e8e728d]: (4, ('Fri Nov 28 05:40:02 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-9c78e191-f6d6-4fbb-a215-3abc59437ec7 (4a8ad07718805aefa8e0015bf567ea1f54a4d968d2b8ccc9bbe63b483e212db1)\n4a8ad07718805aefa8e0015bf567ea1f54a4d968d2b8ccc9bbe63b483e212db1\nFri Nov 28 05:40:02 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-9c78e191-f6d6-4fbb-a215-3abc59437ec7 (4a8ad07718805aefa8e0015bf567ea1f54a4d968d2b8ccc9bbe63b483e212db1)\n4a8ad07718805aefa8e0015bf567ea1f54a4d968d2b8ccc9bbe63b483e212db1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:40:02 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:40:02.842 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[4eafb286-74f5-46c2-a776-c8b0b3b99dec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:40:02 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:40:02.843 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9c78e191-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:40:02 compute-0 nova_compute[187223]: 2025-11-28 17:40:02.844 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:40:02 compute-0 kernel: tap9c78e191-f0: left promiscuous mode
Nov 28 17:40:02 compute-0 nova_compute[187223]: 2025-11-28 17:40:02.857 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:40:02 compute-0 nova_compute[187223]: 2025-11-28 17:40:02.859 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:40:02 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:40:02.862 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[e597b1ea-41c3-4a30-a833-c9786a70819c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:40:02 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:40:02.877 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[28b5b675-bed4-4fb9-9ff6-e99ac346446f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:40:02 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:40:02.878 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[2b9274c0-daae-487e-9ab2-23038c92ba50]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:40:02 compute-0 nova_compute[187223]: 2025-11-28 17:40:02.879 187227 INFO nova.compute.manager [None req-2fa97fdd-6185-48f7-acbb-8db6f32043da e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] [instance: 4103b83a-8bcb-41ce-8044-ae4574ed2c4a] Took 0.43 seconds to destroy the instance on the hypervisor.
Nov 28 17:40:02 compute-0 nova_compute[187223]: 2025-11-28 17:40:02.880 187227 DEBUG oslo.service.loopingcall [None req-2fa97fdd-6185-48f7-acbb-8db6f32043da e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 28 17:40:02 compute-0 nova_compute[187223]: 2025-11-28 17:40:02.880 187227 DEBUG nova.compute.manager [-] [instance: 4103b83a-8bcb-41ce-8044-ae4574ed2c4a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 28 17:40:02 compute-0 nova_compute[187223]: 2025-11-28 17:40:02.880 187227 DEBUG nova.network.neutron [-] [instance: 4103b83a-8bcb-41ce-8044-ae4574ed2c4a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 28 17:40:02 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:40:02.899 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[08fe84a4-cf9a-46cd-b6e5-37876623b0d7]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 467306, 'reachable_time': 35640, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212165, 'error': None, 'target': 'ovnmeta-9c78e191-f6d6-4fbb-a215-3abc59437ec7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:40:02 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:40:02.903 104546 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9c78e191-f6d6-4fbb-a215-3abc59437ec7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 28 17:40:02 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:40:02.904 104546 DEBUG oslo.privsep.daemon [-] privsep: reply[b3e798c7-db35-4ece-b096-3f61ba3bb32d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:40:02 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:40:02.905 104433 INFO neutron.agent.ovn.metadata.agent [-] Port d191711f-581c-4ac9-8d1f-337f340a2713 in datapath 9c78e191-f6d6-4fbb-a215-3abc59437ec7 unbound from our chassis
Nov 28 17:40:02 compute-0 systemd[1]: run-netns-ovnmeta\x2d9c78e191\x2df6d6\x2d4fbb\x2da215\x2d3abc59437ec7.mount: Deactivated successfully.
Nov 28 17:40:02 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:40:02.906 104433 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9c78e191-f6d6-4fbb-a215-3abc59437ec7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 17:40:02 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:40:02.907 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[ad6261a0-813d-488c-b9e1-ffb6da49570f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:40:02 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:40:02.908 104433 INFO neutron.agent.ovn.metadata.agent [-] Port d191711f-581c-4ac9-8d1f-337f340a2713 in datapath 9c78e191-f6d6-4fbb-a215-3abc59437ec7 unbound from our chassis
Nov 28 17:40:02 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:40:02.909 104433 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9c78e191-f6d6-4fbb-a215-3abc59437ec7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 17:40:02 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:40:02.909 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[a174962d-471f-492e-853b-cce2216f6cd7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:40:03 compute-0 nova_compute[187223]: 2025-11-28 17:40:03.486 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:40:05 compute-0 nova_compute[187223]: 2025-11-28 17:40:05.311 187227 DEBUG nova.compute.manager [req-2bc17497-1feb-43e1-9ccf-53fcf5e41a80 req-a737aa14-7efb-4178-be71-7be08050e100 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4103b83a-8bcb-41ce-8044-ae4574ed2c4a] Received event network-vif-unplugged-d191711f-581c-4ac9-8d1f-337f340a2713 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:40:05 compute-0 nova_compute[187223]: 2025-11-28 17:40:05.312 187227 DEBUG oslo_concurrency.lockutils [req-2bc17497-1feb-43e1-9ccf-53fcf5e41a80 req-a737aa14-7efb-4178-be71-7be08050e100 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "4103b83a-8bcb-41ce-8044-ae4574ed2c4a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:40:05 compute-0 nova_compute[187223]: 2025-11-28 17:40:05.312 187227 DEBUG oslo_concurrency.lockutils [req-2bc17497-1feb-43e1-9ccf-53fcf5e41a80 req-a737aa14-7efb-4178-be71-7be08050e100 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "4103b83a-8bcb-41ce-8044-ae4574ed2c4a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:40:05 compute-0 nova_compute[187223]: 2025-11-28 17:40:05.312 187227 DEBUG oslo_concurrency.lockutils [req-2bc17497-1feb-43e1-9ccf-53fcf5e41a80 req-a737aa14-7efb-4178-be71-7be08050e100 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "4103b83a-8bcb-41ce-8044-ae4574ed2c4a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:40:05 compute-0 nova_compute[187223]: 2025-11-28 17:40:05.313 187227 DEBUG nova.compute.manager [req-2bc17497-1feb-43e1-9ccf-53fcf5e41a80 req-a737aa14-7efb-4178-be71-7be08050e100 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4103b83a-8bcb-41ce-8044-ae4574ed2c4a] No waiting events found dispatching network-vif-unplugged-d191711f-581c-4ac9-8d1f-337f340a2713 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:40:05 compute-0 nova_compute[187223]: 2025-11-28 17:40:05.313 187227 DEBUG nova.compute.manager [req-2bc17497-1feb-43e1-9ccf-53fcf5e41a80 req-a737aa14-7efb-4178-be71-7be08050e100 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4103b83a-8bcb-41ce-8044-ae4574ed2c4a] Received event network-vif-unplugged-d191711f-581c-4ac9-8d1f-337f340a2713 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 28 17:40:05 compute-0 nova_compute[187223]: 2025-11-28 17:40:05.888 187227 DEBUG nova.network.neutron [-] [instance: 4103b83a-8bcb-41ce-8044-ae4574ed2c4a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:40:05 compute-0 nova_compute[187223]: 2025-11-28 17:40:05.907 187227 INFO nova.compute.manager [-] [instance: 4103b83a-8bcb-41ce-8044-ae4574ed2c4a] Took 3.03 seconds to deallocate network for instance.
Nov 28 17:40:05 compute-0 nova_compute[187223]: 2025-11-28 17:40:05.956 187227 DEBUG oslo_concurrency.lockutils [None req-2fa97fdd-6185-48f7-acbb-8db6f32043da e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:40:05 compute-0 nova_compute[187223]: 2025-11-28 17:40:05.956 187227 DEBUG oslo_concurrency.lockutils [None req-2fa97fdd-6185-48f7-acbb-8db6f32043da e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:40:05 compute-0 nova_compute[187223]: 2025-11-28 17:40:05.962 187227 DEBUG oslo_concurrency.lockutils [None req-2fa97fdd-6185-48f7-acbb-8db6f32043da e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.006s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:40:05 compute-0 nova_compute[187223]: 2025-11-28 17:40:05.996 187227 INFO nova.scheduler.client.report [None req-2fa97fdd-6185-48f7-acbb-8db6f32043da e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Deleted allocations for instance 4103b83a-8bcb-41ce-8044-ae4574ed2c4a
Nov 28 17:40:06 compute-0 nova_compute[187223]: 2025-11-28 17:40:06.105 187227 DEBUG oslo_concurrency.lockutils [None req-2fa97fdd-6185-48f7-acbb-8db6f32043da e6ed8cc17a7c4f34b32582c250e4b754 f8bcfa33baac4402a3841f550dae7748 - - default default] Lock "4103b83a-8bcb-41ce-8044-ae4574ed2c4a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.656s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:40:07 compute-0 nova_compute[187223]: 2025-11-28 17:40:07.454 187227 DEBUG nova.compute.manager [req-51e7e0b5-697c-4f12-a7a7-e67eddd26c0c req-e7716115-d8aa-4aef-8dfe-ca2ce6beef80 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4103b83a-8bcb-41ce-8044-ae4574ed2c4a] Received event network-vif-plugged-d191711f-581c-4ac9-8d1f-337f340a2713 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:40:07 compute-0 nova_compute[187223]: 2025-11-28 17:40:07.454 187227 DEBUG oslo_concurrency.lockutils [req-51e7e0b5-697c-4f12-a7a7-e67eddd26c0c req-e7716115-d8aa-4aef-8dfe-ca2ce6beef80 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "4103b83a-8bcb-41ce-8044-ae4574ed2c4a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:40:07 compute-0 nova_compute[187223]: 2025-11-28 17:40:07.454 187227 DEBUG oslo_concurrency.lockutils [req-51e7e0b5-697c-4f12-a7a7-e67eddd26c0c req-e7716115-d8aa-4aef-8dfe-ca2ce6beef80 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "4103b83a-8bcb-41ce-8044-ae4574ed2c4a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:40:07 compute-0 nova_compute[187223]: 2025-11-28 17:40:07.455 187227 DEBUG oslo_concurrency.lockutils [req-51e7e0b5-697c-4f12-a7a7-e67eddd26c0c req-e7716115-d8aa-4aef-8dfe-ca2ce6beef80 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "4103b83a-8bcb-41ce-8044-ae4574ed2c4a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:40:07 compute-0 nova_compute[187223]: 2025-11-28 17:40:07.455 187227 DEBUG nova.compute.manager [req-51e7e0b5-697c-4f12-a7a7-e67eddd26c0c req-e7716115-d8aa-4aef-8dfe-ca2ce6beef80 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4103b83a-8bcb-41ce-8044-ae4574ed2c4a] No waiting events found dispatching network-vif-plugged-d191711f-581c-4ac9-8d1f-337f340a2713 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:40:07 compute-0 nova_compute[187223]: 2025-11-28 17:40:07.455 187227 WARNING nova.compute.manager [req-51e7e0b5-697c-4f12-a7a7-e67eddd26c0c req-e7716115-d8aa-4aef-8dfe-ca2ce6beef80 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4103b83a-8bcb-41ce-8044-ae4574ed2c4a] Received unexpected event network-vif-plugged-d191711f-581c-4ac9-8d1f-337f340a2713 for instance with vm_state deleted and task_state None.
Nov 28 17:40:07 compute-0 nova_compute[187223]: 2025-11-28 17:40:07.455 187227 DEBUG nova.compute.manager [req-51e7e0b5-697c-4f12-a7a7-e67eddd26c0c req-e7716115-d8aa-4aef-8dfe-ca2ce6beef80 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4103b83a-8bcb-41ce-8044-ae4574ed2c4a] Received event network-vif-deleted-d191711f-581c-4ac9-8d1f-337f340a2713 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:40:07 compute-0 nova_compute[187223]: 2025-11-28 17:40:07.821 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:40:08 compute-0 podman[212166]: 2025-11-28 17:40:08.23501872 +0000 UTC m=+0.094865239 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 28 17:40:08 compute-0 podman[212167]: 2025-11-28 17:40:08.242414075 +0000 UTC m=+0.098485504 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 17:40:08 compute-0 nova_compute[187223]: 2025-11-28 17:40:08.526 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:40:10 compute-0 nova_compute[187223]: 2025-11-28 17:40:10.143 187227 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764351595.1420786, 46d07989-a2d7-4ab0-a623-3bf99e2b2b81 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:40:10 compute-0 nova_compute[187223]: 2025-11-28 17:40:10.144 187227 INFO nova.compute.manager [-] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] VM Stopped (Lifecycle Event)
Nov 28 17:40:10 compute-0 nova_compute[187223]: 2025-11-28 17:40:10.330 187227 DEBUG nova.compute.manager [None req-c5ea2d28-c7f1-407d-a119-23f0dd7aa58d - - - - - -] [instance: 46d07989-a2d7-4ab0-a623-3bf99e2b2b81] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:40:11 compute-0 podman[212214]: 2025-11-28 17:40:11.223741973 +0000 UTC m=+0.081840600 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, name=ubi9-minimal, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, distribution-scope=public, release=1755695350, config_id=edpm, io.buildah.version=1.33.7, vendor=Red Hat, Inc., io.openshift.expose-services=, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc.)
Nov 28 17:40:12 compute-0 nova_compute[187223]: 2025-11-28 17:40:12.824 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:40:13 compute-0 nova_compute[187223]: 2025-11-28 17:40:13.528 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:40:17 compute-0 nova_compute[187223]: 2025-11-28 17:40:17.789 187227 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764351602.7866457, 4103b83a-8bcb-41ce-8044-ae4574ed2c4a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:40:17 compute-0 nova_compute[187223]: 2025-11-28 17:40:17.789 187227 INFO nova.compute.manager [-] [instance: 4103b83a-8bcb-41ce-8044-ae4574ed2c4a] VM Stopped (Lifecycle Event)
Nov 28 17:40:17 compute-0 nova_compute[187223]: 2025-11-28 17:40:17.817 187227 DEBUG nova.compute.manager [None req-62dc9af3-cefd-4cae-8dc9-cc004c0ed7db - - - - - -] [instance: 4103b83a-8bcb-41ce-8044-ae4574ed2c4a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:40:17 compute-0 nova_compute[187223]: 2025-11-28 17:40:17.827 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:40:18 compute-0 nova_compute[187223]: 2025-11-28 17:40:18.530 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:40:22 compute-0 nova_compute[187223]: 2025-11-28 17:40:22.828 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:40:23 compute-0 nova_compute[187223]: 2025-11-28 17:40:23.533 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:40:27 compute-0 podman[212237]: 2025-11-28 17:40:27.230517511 +0000 UTC m=+0.075783134 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 17:40:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:40:27.687 104433 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:40:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:40:27.687 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:40:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:40:27.688 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:40:27 compute-0 nova_compute[187223]: 2025-11-28 17:40:27.831 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:40:28 compute-0 nova_compute[187223]: 2025-11-28 17:40:28.534 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:40:29 compute-0 podman[197556]: time="2025-11-28T17:40:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:40:29 compute-0 podman[197556]: @ - - [28/Nov/2025:17:40:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 28 17:40:29 compute-0 podman[197556]: @ - - [28/Nov/2025:17:40:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2586 "" "Go-http-client/1.1"
Nov 28 17:40:31 compute-0 openstack_network_exporter[199717]: ERROR   17:40:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:40:31 compute-0 openstack_network_exporter[199717]: ERROR   17:40:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:40:31 compute-0 openstack_network_exporter[199717]: ERROR   17:40:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:40:31 compute-0 openstack_network_exporter[199717]: ERROR   17:40:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:40:31 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:40:31 compute-0 openstack_network_exporter[199717]: ERROR   17:40:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:40:31 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:40:31 compute-0 nova_compute[187223]: 2025-11-28 17:40:31.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:40:31 compute-0 nova_compute[187223]: 2025-11-28 17:40:31.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:40:31 compute-0 sshd-session[212260]: Invalid user ubuntu from 193.32.162.146 port 42478
Nov 28 17:40:31 compute-0 nova_compute[187223]: 2025-11-28 17:40:31.724 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:40:31 compute-0 nova_compute[187223]: 2025-11-28 17:40:31.724 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:40:31 compute-0 nova_compute[187223]: 2025-11-28 17:40:31.725 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:40:31 compute-0 nova_compute[187223]: 2025-11-28 17:40:31.725 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 17:40:31 compute-0 sshd-session[212260]: Connection closed by invalid user ubuntu 193.32.162.146 port 42478 [preauth]
Nov 28 17:40:31 compute-0 nova_compute[187223]: 2025-11-28 17:40:31.901 187227 WARNING nova.virt.libvirt.driver [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 17:40:31 compute-0 nova_compute[187223]: 2025-11-28 17:40:31.902 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5878MB free_disk=73.34130859375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 17:40:31 compute-0 nova_compute[187223]: 2025-11-28 17:40:31.903 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:40:31 compute-0 nova_compute[187223]: 2025-11-28 17:40:31.903 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:40:32 compute-0 nova_compute[187223]: 2025-11-28 17:40:32.011 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 17:40:32 compute-0 nova_compute[187223]: 2025-11-28 17:40:32.011 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 17:40:32 compute-0 nova_compute[187223]: 2025-11-28 17:40:32.041 187227 DEBUG nova.compute.provider_tree [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 17:40:32 compute-0 nova_compute[187223]: 2025-11-28 17:40:32.070 187227 DEBUG nova.scheduler.client.report [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 17:40:32 compute-0 nova_compute[187223]: 2025-11-28 17:40:32.103 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 17:40:32 compute-0 nova_compute[187223]: 2025-11-28 17:40:32.104 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.201s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:40:32 compute-0 nova_compute[187223]: 2025-11-28 17:40:32.833 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:40:33 compute-0 nova_compute[187223]: 2025-11-28 17:40:33.103 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:40:33 compute-0 nova_compute[187223]: 2025-11-28 17:40:33.104 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:40:33 compute-0 nova_compute[187223]: 2025-11-28 17:40:33.105 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 17:40:33 compute-0 podman[212263]: 2025-11-28 17:40:33.211589162 +0000 UTC m=+0.067000599 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
container_name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 28 17:40:33 compute-0 nova_compute[187223]: 2025-11-28 17:40:33.536 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:40:34 compute-0 nova_compute[187223]: 2025-11-28 17:40:34.680 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:40:35 compute-0 nova_compute[187223]: 2025-11-28 17:40:35.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:40:35 compute-0 ovn_controller[95574]: 2025-11-28T17:40:35Z|00101|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Nov 28 17:40:36 compute-0 nova_compute[187223]: 2025-11-28 17:40:36.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:40:36 compute-0 nova_compute[187223]: 2025-11-28 17:40:36.684 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 17:40:36 compute-0 nova_compute[187223]: 2025-11-28 17:40:36.684 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 17:40:36 compute-0 nova_compute[187223]: 2025-11-28 17:40:36.709 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 17:40:36 compute-0 nova_compute[187223]: 2025-11-28 17:40:36.710 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:40:36 compute-0 nova_compute[187223]: 2025-11-28 17:40:36.711 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:40:37 compute-0 nova_compute[187223]: 2025-11-28 17:40:37.868 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:40:38 compute-0 nova_compute[187223]: 2025-11-28 17:40:38.538 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:40:39 compute-0 podman[212282]: 2025-11-28 17:40:39.219006238 +0000 UTC m=+0.072776566 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd)
Nov 28 17:40:39 compute-0 podman[212283]: 2025-11-28 17:40:39.24660209 +0000 UTC m=+0.098620117 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_managed=true)
Nov 28 17:40:39 compute-0 nova_compute[187223]: 2025-11-28 17:40:39.706 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:40:42 compute-0 podman[212325]: 2025-11-28 17:40:42.260167265 +0000 UTC m=+0.106514977 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.openshift.tags=minimal rhel9, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, vcs-type=git, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, release=1755695350)
Nov 28 17:40:42 compute-0 nova_compute[187223]: 2025-11-28 17:40:42.869 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:40:43 compute-0 nova_compute[187223]: 2025-11-28 17:40:43.540 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:40:47 compute-0 nova_compute[187223]: 2025-11-28 17:40:47.871 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:40:48 compute-0 nova_compute[187223]: 2025-11-28 17:40:48.545 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:40:52 compute-0 nova_compute[187223]: 2025-11-28 17:40:52.873 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:40:53 compute-0 nova_compute[187223]: 2025-11-28 17:40:53.544 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:40:57 compute-0 nova_compute[187223]: 2025-11-28 17:40:57.888 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:40:58 compute-0 podman[212347]: 2025-11-28 17:40:58.208964427 +0000 UTC m=+0.069286516 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 17:40:58 compute-0 nova_compute[187223]: 2025-11-28 17:40:58.547 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:40:59 compute-0 podman[197556]: time="2025-11-28T17:40:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:40:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:40:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 28 17:40:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:40:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2592 "" "Go-http-client/1.1"
Nov 28 17:41:01 compute-0 openstack_network_exporter[199717]: ERROR   17:41:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:41:01 compute-0 openstack_network_exporter[199717]: ERROR   17:41:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:41:01 compute-0 openstack_network_exporter[199717]: ERROR   17:41:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:41:01 compute-0 openstack_network_exporter[199717]: ERROR   17:41:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:41:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:41:01 compute-0 openstack_network_exporter[199717]: ERROR   17:41:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:41:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:41:02 compute-0 nova_compute[187223]: 2025-11-28 17:41:02.890 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:41:03 compute-0 nova_compute[187223]: 2025-11-28 17:41:03.587 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:41:04 compute-0 podman[212373]: 2025-11-28 17:41:04.214953691 +0000 UTC m=+0.071800558 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 17:41:08 compute-0 nova_compute[187223]: 2025-11-28 17:41:08.134 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:41:08 compute-0 nova_compute[187223]: 2025-11-28 17:41:08.590 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:41:10 compute-0 podman[212393]: 2025-11-28 17:41:10.221636003 +0000 UTC m=+0.077703354 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 17:41:10 compute-0 podman[212394]: 2025-11-28 17:41:10.283160066 +0000 UTC m=+0.124275723 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 17:41:11 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:41:11.311 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3e:3c:af', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '0e:e0:60:f9:5f:25'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 17:41:11 compute-0 nova_compute[187223]: 2025-11-28 17:41:11.313 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:41:11 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:41:11.313 104433 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 17:41:13 compute-0 nova_compute[187223]: 2025-11-28 17:41:13.135 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:41:13 compute-0 podman[212437]: 2025-11-28 17:41:13.206032545 +0000 UTC m=+0.071376955 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, distribution-scope=public, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., vcs-type=git, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 28 17:41:13 compute-0 nova_compute[187223]: 2025-11-28 17:41:13.570 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:41:13 compute-0 nova_compute[187223]: 2025-11-28 17:41:13.591 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:41:18 compute-0 nova_compute[187223]: 2025-11-28 17:41:18.138 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:41:18 compute-0 nova_compute[187223]: 2025-11-28 17:41:18.593 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:41:19 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:41:19.316 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2ad2dbac-a967-40fb-b69b-7c374c5f8e9d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:41:23 compute-0 nova_compute[187223]: 2025-11-28 17:41:23.142 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:41:23 compute-0 nova_compute[187223]: 2025-11-28 17:41:23.595 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:41:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:41:27.688 104433 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:41:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:41:27.689 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:41:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:41:27.689 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:41:28 compute-0 nova_compute[187223]: 2025-11-28 17:41:28.146 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:41:28 compute-0 nova_compute[187223]: 2025-11-28 17:41:28.596 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:41:29 compute-0 podman[212460]: 2025-11-28 17:41:29.199888758 +0000 UTC m=+0.054989880 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 17:41:29 compute-0 podman[197556]: time="2025-11-28T17:41:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:41:29 compute-0 podman[197556]: @ - - [28/Nov/2025:17:41:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 28 17:41:29 compute-0 podman[197556]: @ - - [28/Nov/2025:17:41:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2592 "" "Go-http-client/1.1"
Nov 28 17:41:31 compute-0 openstack_network_exporter[199717]: ERROR   17:41:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:41:31 compute-0 openstack_network_exporter[199717]: ERROR   17:41:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:41:31 compute-0 openstack_network_exporter[199717]: ERROR   17:41:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:41:31 compute-0 openstack_network_exporter[199717]: ERROR   17:41:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:41:31 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:41:31 compute-0 openstack_network_exporter[199717]: ERROR   17:41:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:41:31 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:41:31 compute-0 nova_compute[187223]: 2025-11-28 17:41:31.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:41:32 compute-0 nova_compute[187223]: 2025-11-28 17:41:32.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:41:32 compute-0 nova_compute[187223]: 2025-11-28 17:41:32.685 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:41:32 compute-0 nova_compute[187223]: 2025-11-28 17:41:32.708 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:41:32 compute-0 nova_compute[187223]: 2025-11-28 17:41:32.708 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:41:32 compute-0 nova_compute[187223]: 2025-11-28 17:41:32.709 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:41:32 compute-0 nova_compute[187223]: 2025-11-28 17:41:32.709 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 17:41:32 compute-0 nova_compute[187223]: 2025-11-28 17:41:32.954 187227 WARNING nova.virt.libvirt.driver [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 17:41:32 compute-0 nova_compute[187223]: 2025-11-28 17:41:32.955 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5887MB free_disk=73.34130859375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 17:41:32 compute-0 nova_compute[187223]: 2025-11-28 17:41:32.955 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:41:32 compute-0 nova_compute[187223]: 2025-11-28 17:41:32.956 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:41:33 compute-0 nova_compute[187223]: 2025-11-28 17:41:33.021 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 17:41:33 compute-0 nova_compute[187223]: 2025-11-28 17:41:33.021 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 17:41:33 compute-0 nova_compute[187223]: 2025-11-28 17:41:33.039 187227 DEBUG nova.scheduler.client.report [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Refreshing inventories for resource provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 28 17:41:33 compute-0 nova_compute[187223]: 2025-11-28 17:41:33.061 187227 DEBUG nova.scheduler.client.report [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Updating ProviderTree inventory for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 28 17:41:33 compute-0 nova_compute[187223]: 2025-11-28 17:41:33.061 187227 DEBUG nova.compute.provider_tree [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Updating inventory in ProviderTree for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 28 17:41:33 compute-0 nova_compute[187223]: 2025-11-28 17:41:33.075 187227 DEBUG nova.scheduler.client.report [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Refreshing aggregate associations for resource provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 28 17:41:33 compute-0 nova_compute[187223]: 2025-11-28 17:41:33.095 187227 DEBUG nova.scheduler.client.report [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Refreshing trait associations for resource provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5, traits: COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_USB,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE41,HW_CPU_X86_SSE42,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_SSE,HW_CPU_X86_SSSE3,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 28 17:41:33 compute-0 nova_compute[187223]: 2025-11-28 17:41:33.119 187227 DEBUG nova.compute.provider_tree [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 17:41:33 compute-0 nova_compute[187223]: 2025-11-28 17:41:33.133 187227 DEBUG nova.scheduler.client.report [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 17:41:33 compute-0 nova_compute[187223]: 2025-11-28 17:41:33.135 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 17:41:33 compute-0 nova_compute[187223]: 2025-11-28 17:41:33.135 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.179s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:41:33 compute-0 nova_compute[187223]: 2025-11-28 17:41:33.149 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:41:33 compute-0 nova_compute[187223]: 2025-11-28 17:41:33.599 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:41:34 compute-0 nova_compute[187223]: 2025-11-28 17:41:34.135 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:41:34 compute-0 nova_compute[187223]: 2025-11-28 17:41:34.135 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 17:41:35 compute-0 podman[212484]: 2025-11-28 17:41:35.202707825 +0000 UTC m=+0.065050385 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 28 17:41:35 compute-0 nova_compute[187223]: 2025-11-28 17:41:35.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:41:36 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Nov 28 17:41:36 compute-0 nova_compute[187223]: 2025-11-28 17:41:36.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:41:37 compute-0 nova_compute[187223]: 2025-11-28 17:41:37.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:41:37 compute-0 nova_compute[187223]: 2025-11-28 17:41:37.684 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 17:41:37 compute-0 nova_compute[187223]: 2025-11-28 17:41:37.684 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 17:41:37 compute-0 nova_compute[187223]: 2025-11-28 17:41:37.702 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 17:41:38 compute-0 nova_compute[187223]: 2025-11-28 17:41:38.153 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:41:38 compute-0 nova_compute[187223]: 2025-11-28 17:41:38.601 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:41:38 compute-0 nova_compute[187223]: 2025-11-28 17:41:38.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:41:39 compute-0 nova_compute[187223]: 2025-11-28 17:41:39.679 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:41:41 compute-0 podman[212506]: 2025-11-28 17:41:41.199024759 +0000 UTC m=+0.062000370 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 17:41:41 compute-0 podman[212507]: 2025-11-28 17:41:41.253452632 +0000 UTC m=+0.108441926 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 28 17:41:43 compute-0 nova_compute[187223]: 2025-11-28 17:41:43.200 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:41:43 compute-0 nova_compute[187223]: 2025-11-28 17:41:43.603 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:41:44 compute-0 podman[212554]: 2025-11-28 17:41:44.260760693 +0000 UTC m=+0.111146792 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, managed_by=edpm_ansible, vcs-type=git, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64)
Nov 28 17:41:47 compute-0 ovn_controller[95574]: 2025-11-28T17:41:47Z|00102|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Nov 28 17:41:48 compute-0 nova_compute[187223]: 2025-11-28 17:41:48.203 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:41:48 compute-0 nova_compute[187223]: 2025-11-28 17:41:48.606 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:41:53 compute-0 nova_compute[187223]: 2025-11-28 17:41:53.207 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:41:53 compute-0 nova_compute[187223]: 2025-11-28 17:41:53.608 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:41:58 compute-0 nova_compute[187223]: 2025-11-28 17:41:58.210 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:41:58 compute-0 nova_compute[187223]: 2025-11-28 17:41:58.612 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:41:59 compute-0 podman[197556]: time="2025-11-28T17:41:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:41:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:41:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 28 17:41:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:41:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2591 "" "Go-http-client/1.1"
Nov 28 17:42:00 compute-0 podman[212575]: 2025-11-28 17:42:00.190137018 +0000 UTC m=+0.052604172 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 17:42:01 compute-0 openstack_network_exporter[199717]: ERROR   17:42:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:42:01 compute-0 openstack_network_exporter[199717]: ERROR   17:42:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:42:01 compute-0 openstack_network_exporter[199717]: ERROR   17:42:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:42:01 compute-0 openstack_network_exporter[199717]: ERROR   17:42:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:42:01 compute-0 openstack_network_exporter[199717]: ERROR   17:42:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:42:03 compute-0 nova_compute[187223]: 2025-11-28 17:42:03.213 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:42:03 compute-0 nova_compute[187223]: 2025-11-28 17:42:03.613 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:42:06 compute-0 podman[212599]: 2025-11-28 17:42:06.217724109 +0000 UTC m=+0.083020825 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, 
org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 17:42:08 compute-0 nova_compute[187223]: 2025-11-28 17:42:08.217 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:42:08 compute-0 nova_compute[187223]: 2025-11-28 17:42:08.654 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:42:12 compute-0 podman[212621]: 2025-11-28 17:42:12.200115567 +0000 UTC m=+0.063141091 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 28 17:42:12 compute-0 nova_compute[187223]: 2025-11-28 17:42:12.219 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:42:12 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:42:12.218 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3e:3c:af', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '0e:e0:60:f9:5f:25'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 17:42:12 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:42:12.220 104433 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 17:42:12 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:42:12.223 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2ad2dbac-a967-40fb-b69b-7c374c5f8e9d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:42:12 compute-0 podman[212622]: 2025-11-28 17:42:12.235097069 +0000 UTC m=+0.094798508 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller)
Nov 28 17:42:13 compute-0 nova_compute[187223]: 2025-11-28 17:42:13.220 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:42:13 compute-0 nova_compute[187223]: 2025-11-28 17:42:13.694 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:42:15 compute-0 podman[212666]: 2025-11-28 17:42:15.220325825 +0000 UTC m=+0.082665375 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, distribution-scope=public, com.redhat.component=ubi9-minimal-container, 
container_name=openstack_network_exporter, name=ubi9-minimal, vcs-type=git, architecture=x86_64, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, release=1755695350)
Nov 28 17:42:18 compute-0 nova_compute[187223]: 2025-11-28 17:42:18.224 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:42:18 compute-0 nova_compute[187223]: 2025-11-28 17:42:18.696 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:42:23 compute-0 nova_compute[187223]: 2025-11-28 17:42:23.227 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:42:23 compute-0 nova_compute[187223]: 2025-11-28 17:42:23.697 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:42:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:42:27.689 104433 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:42:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:42:27.690 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:42:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:42:27.690 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:42:27 compute-0 nova_compute[187223]: 2025-11-28 17:42:27.968 187227 DEBUG oslo_concurrency.lockutils [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Acquiring lock "34d2d5a1-837e-49dc-a047-618a9ed35dd1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:42:27 compute-0 nova_compute[187223]: 2025-11-28 17:42:27.969 187227 DEBUG oslo_concurrency.lockutils [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "34d2d5a1-837e-49dc-a047-618a9ed35dd1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:42:27 compute-0 nova_compute[187223]: 2025-11-28 17:42:27.987 187227 DEBUG nova.compute.manager [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 28 17:42:28 compute-0 nova_compute[187223]: 2025-11-28 17:42:28.085 187227 DEBUG oslo_concurrency.lockutils [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:42:28 compute-0 nova_compute[187223]: 2025-11-28 17:42:28.086 187227 DEBUG oslo_concurrency.lockutils [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:42:28 compute-0 nova_compute[187223]: 2025-11-28 17:42:28.093 187227 DEBUG nova.virt.hardware [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 28 17:42:28 compute-0 nova_compute[187223]: 2025-11-28 17:42:28.093 187227 INFO nova.compute.claims [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Claim successful on node compute-0.ctlplane.example.com
Nov 28 17:42:28 compute-0 nova_compute[187223]: 2025-11-28 17:42:28.217 187227 DEBUG nova.compute.provider_tree [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 17:42:28 compute-0 nova_compute[187223]: 2025-11-28 17:42:28.230 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:42:28 compute-0 nova_compute[187223]: 2025-11-28 17:42:28.236 187227 DEBUG nova.scheduler.client.report [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 17:42:28 compute-0 nova_compute[187223]: 2025-11-28 17:42:28.258 187227 DEBUG oslo_concurrency.lockutils [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.172s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:42:28 compute-0 nova_compute[187223]: 2025-11-28 17:42:28.259 187227 DEBUG nova.compute.manager [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 28 17:42:28 compute-0 nova_compute[187223]: 2025-11-28 17:42:28.321 187227 DEBUG nova.compute.manager [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 28 17:42:28 compute-0 nova_compute[187223]: 2025-11-28 17:42:28.322 187227 DEBUG nova.network.neutron [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 28 17:42:28 compute-0 nova_compute[187223]: 2025-11-28 17:42:28.344 187227 INFO nova.virt.libvirt.driver [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 28 17:42:28 compute-0 nova_compute[187223]: 2025-11-28 17:42:28.370 187227 DEBUG nova.compute.manager [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 28 17:42:28 compute-0 nova_compute[187223]: 2025-11-28 17:42:28.699 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:42:28 compute-0 nova_compute[187223]: 2025-11-28 17:42:28.779 187227 DEBUG nova.compute.manager [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 28 17:42:28 compute-0 nova_compute[187223]: 2025-11-28 17:42:28.780 187227 DEBUG nova.virt.libvirt.driver [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 28 17:42:28 compute-0 nova_compute[187223]: 2025-11-28 17:42:28.780 187227 INFO nova.virt.libvirt.driver [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Creating image(s)
Nov 28 17:42:28 compute-0 nova_compute[187223]: 2025-11-28 17:42:28.781 187227 DEBUG oslo_concurrency.lockutils [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Acquiring lock "/var/lib/nova/instances/34d2d5a1-837e-49dc-a047-618a9ed35dd1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:42:28 compute-0 nova_compute[187223]: 2025-11-28 17:42:28.781 187227 DEBUG oslo_concurrency.lockutils [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "/var/lib/nova/instances/34d2d5a1-837e-49dc-a047-618a9ed35dd1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:42:28 compute-0 nova_compute[187223]: 2025-11-28 17:42:28.782 187227 DEBUG oslo_concurrency.lockutils [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "/var/lib/nova/instances/34d2d5a1-837e-49dc-a047-618a9ed35dd1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:42:28 compute-0 nova_compute[187223]: 2025-11-28 17:42:28.795 187227 DEBUG oslo_concurrency.processutils [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:42:28 compute-0 nova_compute[187223]: 2025-11-28 17:42:28.876 187227 DEBUG oslo_concurrency.processutils [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:42:28 compute-0 nova_compute[187223]: 2025-11-28 17:42:28.877 187227 DEBUG oslo_concurrency.lockutils [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Acquiring lock "bd6565e0cc32420aac21dd3de8842bc53bb13eba" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:42:28 compute-0 nova_compute[187223]: 2025-11-28 17:42:28.877 187227 DEBUG oslo_concurrency.lockutils [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "bd6565e0cc32420aac21dd3de8842bc53bb13eba" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:42:28 compute-0 nova_compute[187223]: 2025-11-28 17:42:28.888 187227 DEBUG oslo_concurrency.processutils [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:42:28 compute-0 nova_compute[187223]: 2025-11-28 17:42:28.946 187227 DEBUG oslo_concurrency.processutils [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:42:28 compute-0 nova_compute[187223]: 2025-11-28 17:42:28.947 187227 DEBUG oslo_concurrency.processutils [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba,backing_fmt=raw /var/lib/nova/instances/34d2d5a1-837e-49dc-a047-618a9ed35dd1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:42:28 compute-0 nova_compute[187223]: 2025-11-28 17:42:28.982 187227 DEBUG oslo_concurrency.processutils [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba,backing_fmt=raw /var/lib/nova/instances/34d2d5a1-837e-49dc-a047-618a9ed35dd1/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:42:28 compute-0 nova_compute[187223]: 2025-11-28 17:42:28.983 187227 DEBUG oslo_concurrency.lockutils [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "bd6565e0cc32420aac21dd3de8842bc53bb13eba" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:42:28 compute-0 nova_compute[187223]: 2025-11-28 17:42:28.983 187227 DEBUG oslo_concurrency.processutils [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:42:29 compute-0 nova_compute[187223]: 2025-11-28 17:42:29.042 187227 DEBUG oslo_concurrency.processutils [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:42:29 compute-0 nova_compute[187223]: 2025-11-28 17:42:29.043 187227 DEBUG nova.virt.disk.api [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Checking if we can resize image /var/lib/nova/instances/34d2d5a1-837e-49dc-a047-618a9ed35dd1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 28 17:42:29 compute-0 nova_compute[187223]: 2025-11-28 17:42:29.044 187227 DEBUG oslo_concurrency.processutils [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/34d2d5a1-837e-49dc-a047-618a9ed35dd1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:42:29 compute-0 nova_compute[187223]: 2025-11-28 17:42:29.065 187227 DEBUG nova.policy [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '40bca16232f3471c8094a414f8874e9a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f987f40adf1f46018ab0ca81b8d954f6', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 28 17:42:29 compute-0 nova_compute[187223]: 2025-11-28 17:42:29.104 187227 DEBUG oslo_concurrency.processutils [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/34d2d5a1-837e-49dc-a047-618a9ed35dd1/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:42:29 compute-0 nova_compute[187223]: 2025-11-28 17:42:29.105 187227 DEBUG nova.virt.disk.api [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Cannot resize image /var/lib/nova/instances/34d2d5a1-837e-49dc-a047-618a9ed35dd1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 28 17:42:29 compute-0 nova_compute[187223]: 2025-11-28 17:42:29.105 187227 DEBUG nova.objects.instance [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lazy-loading 'migration_context' on Instance uuid 34d2d5a1-837e-49dc-a047-618a9ed35dd1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:42:29 compute-0 nova_compute[187223]: 2025-11-28 17:42:29.119 187227 DEBUG nova.virt.libvirt.driver [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 28 17:42:29 compute-0 nova_compute[187223]: 2025-11-28 17:42:29.119 187227 DEBUG nova.virt.libvirt.driver [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Ensure instance console log exists: /var/lib/nova/instances/34d2d5a1-837e-49dc-a047-618a9ed35dd1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 28 17:42:29 compute-0 nova_compute[187223]: 2025-11-28 17:42:29.120 187227 DEBUG oslo_concurrency.lockutils [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:42:29 compute-0 nova_compute[187223]: 2025-11-28 17:42:29.120 187227 DEBUG oslo_concurrency.lockutils [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:42:29 compute-0 nova_compute[187223]: 2025-11-28 17:42:29.120 187227 DEBUG oslo_concurrency.lockutils [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:42:29 compute-0 nova_compute[187223]: 2025-11-28 17:42:29.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:42:29 compute-0 nova_compute[187223]: 2025-11-28 17:42:29.684 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 28 17:42:29 compute-0 nova_compute[187223]: 2025-11-28 17:42:29.706 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 28 17:42:29 compute-0 podman[197556]: time="2025-11-28T17:42:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:42:29 compute-0 podman[197556]: @ - - [28/Nov/2025:17:42:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 28 17:42:29 compute-0 nova_compute[187223]: 2025-11-28 17:42:29.750 187227 DEBUG nova.network.neutron [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Successfully created port: 28f9e34a-6f32-471b-94c6-433b669475ca _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 28 17:42:29 compute-0 podman[197556]: @ - - [28/Nov/2025:17:42:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2593 "" "Go-http-client/1.1"
Nov 28 17:42:30 compute-0 nova_compute[187223]: 2025-11-28 17:42:30.632 187227 DEBUG nova.network.neutron [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Successfully updated port: 28f9e34a-6f32-471b-94c6-433b669475ca _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 28 17:42:30 compute-0 nova_compute[187223]: 2025-11-28 17:42:30.659 187227 DEBUG oslo_concurrency.lockutils [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Acquiring lock "refresh_cache-34d2d5a1-837e-49dc-a047-618a9ed35dd1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 17:42:30 compute-0 nova_compute[187223]: 2025-11-28 17:42:30.659 187227 DEBUG oslo_concurrency.lockutils [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Acquired lock "refresh_cache-34d2d5a1-837e-49dc-a047-618a9ed35dd1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 17:42:30 compute-0 nova_compute[187223]: 2025-11-28 17:42:30.659 187227 DEBUG nova.network.neutron [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 28 17:42:30 compute-0 nova_compute[187223]: 2025-11-28 17:42:30.730 187227 DEBUG nova.compute.manager [req-d7c057f0-af3c-421c-8c66-98efe979f232 req-9a12eb40-4a5e-4ab3-9ff5-b737fd72f8f1 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Received event network-changed-28f9e34a-6f32-471b-94c6-433b669475ca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:42:30 compute-0 nova_compute[187223]: 2025-11-28 17:42:30.730 187227 DEBUG nova.compute.manager [req-d7c057f0-af3c-421c-8c66-98efe979f232 req-9a12eb40-4a5e-4ab3-9ff5-b737fd72f8f1 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Refreshing instance network info cache due to event network-changed-28f9e34a-6f32-471b-94c6-433b669475ca. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 28 17:42:30 compute-0 nova_compute[187223]: 2025-11-28 17:42:30.731 187227 DEBUG oslo_concurrency.lockutils [req-d7c057f0-af3c-421c-8c66-98efe979f232 req-9a12eb40-4a5e-4ab3-9ff5-b737fd72f8f1 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "refresh_cache-34d2d5a1-837e-49dc-a047-618a9ed35dd1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 17:42:30 compute-0 nova_compute[187223]: 2025-11-28 17:42:30.796 187227 DEBUG nova.network.neutron [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 28 17:42:31 compute-0 podman[212703]: 2025-11-28 17:42:31.22735769 +0000 UTC m=+0.069076539 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 17:42:31 compute-0 openstack_network_exporter[199717]: ERROR   17:42:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:42:31 compute-0 openstack_network_exporter[199717]: ERROR   17:42:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:42:31 compute-0 openstack_network_exporter[199717]: ERROR   17:42:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:42:31 compute-0 openstack_network_exporter[199717]: ERROR   17:42:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:42:31 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:42:31 compute-0 openstack_network_exporter[199717]: ERROR   17:42:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:42:31 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:42:31 compute-0 nova_compute[187223]: 2025-11-28 17:42:31.645 187227 DEBUG nova.network.neutron [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Updating instance_info_cache with network_info: [{"id": "28f9e34a-6f32-471b-94c6-433b669475ca", "address": "fa:16:3e:b7:22:8e", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap28f9e34a-6f", "ovs_interfaceid": "28f9e34a-6f32-471b-94c6-433b669475ca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:42:31 compute-0 nova_compute[187223]: 2025-11-28 17:42:31.671 187227 DEBUG oslo_concurrency.lockutils [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Releasing lock "refresh_cache-34d2d5a1-837e-49dc-a047-618a9ed35dd1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 17:42:31 compute-0 nova_compute[187223]: 2025-11-28 17:42:31.672 187227 DEBUG nova.compute.manager [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Instance network_info: |[{"id": "28f9e34a-6f32-471b-94c6-433b669475ca", "address": "fa:16:3e:b7:22:8e", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap28f9e34a-6f", "ovs_interfaceid": "28f9e34a-6f32-471b-94c6-433b669475ca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 28 17:42:31 compute-0 nova_compute[187223]: 2025-11-28 17:42:31.672 187227 DEBUG oslo_concurrency.lockutils [req-d7c057f0-af3c-421c-8c66-98efe979f232 req-9a12eb40-4a5e-4ab3-9ff5-b737fd72f8f1 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquired lock "refresh_cache-34d2d5a1-837e-49dc-a047-618a9ed35dd1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 17:42:31 compute-0 nova_compute[187223]: 2025-11-28 17:42:31.673 187227 DEBUG nova.network.neutron [req-d7c057f0-af3c-421c-8c66-98efe979f232 req-9a12eb40-4a5e-4ab3-9ff5-b737fd72f8f1 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Refreshing network info cache for port 28f9e34a-6f32-471b-94c6-433b669475ca _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 28 17:42:31 compute-0 nova_compute[187223]: 2025-11-28 17:42:31.676 187227 DEBUG nova.virt.libvirt.driver [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Start _get_guest_xml network_info=[{"id": "28f9e34a-6f32-471b-94c6-433b669475ca", "address": "fa:16:3e:b7:22:8e", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap28f9e34a-6f", "ovs_interfaceid": "28f9e34a-6f32-471b-94c6-433b669475ca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T17:27:07Z,direct_url=<?>,disk_format='qcow2',id=e66bcfff-a835-4b6a-9892-490d158c356a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ca3b936304c947f98c618f4bf25dee0e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-28T17:27:08Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_format': None, 'device_name': '/dev/vda', 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e66bcfff-a835-4b6a-9892-490d158c356a'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 28 17:42:31 compute-0 nova_compute[187223]: 2025-11-28 17:42:31.682 187227 WARNING nova.virt.libvirt.driver [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 17:42:31 compute-0 nova_compute[187223]: 2025-11-28 17:42:31.687 187227 DEBUG nova.virt.libvirt.host [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 28 17:42:31 compute-0 nova_compute[187223]: 2025-11-28 17:42:31.688 187227 DEBUG nova.virt.libvirt.host [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 28 17:42:31 compute-0 nova_compute[187223]: 2025-11-28 17:42:31.696 187227 DEBUG nova.virt.libvirt.host [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 28 17:42:31 compute-0 nova_compute[187223]: 2025-11-28 17:42:31.697 187227 DEBUG nova.virt.libvirt.host [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 28 17:42:31 compute-0 nova_compute[187223]: 2025-11-28 17:42:31.699 187227 DEBUG nova.virt.libvirt.driver [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 28 17:42:31 compute-0 nova_compute[187223]: 2025-11-28 17:42:31.699 187227 DEBUG nova.virt.hardware [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-28T17:27:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6f44bded-bdbe-4623-9c87-afc5919e8381',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T17:27:07Z,direct_url=<?>,disk_format='qcow2',id=e66bcfff-a835-4b6a-9892-490d158c356a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ca3b936304c947f98c618f4bf25dee0e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-28T17:27:08Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 28 17:42:31 compute-0 nova_compute[187223]: 2025-11-28 17:42:31.700 187227 DEBUG nova.virt.hardware [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 28 17:42:31 compute-0 nova_compute[187223]: 2025-11-28 17:42:31.701 187227 DEBUG nova.virt.hardware [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 28 17:42:31 compute-0 nova_compute[187223]: 2025-11-28 17:42:31.701 187227 DEBUG nova.virt.hardware [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 28 17:42:31 compute-0 nova_compute[187223]: 2025-11-28 17:42:31.702 187227 DEBUG nova.virt.hardware [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 28 17:42:31 compute-0 nova_compute[187223]: 2025-11-28 17:42:31.702 187227 DEBUG nova.virt.hardware [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 28 17:42:31 compute-0 nova_compute[187223]: 2025-11-28 17:42:31.703 187227 DEBUG nova.virt.hardware [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 28 17:42:31 compute-0 nova_compute[187223]: 2025-11-28 17:42:31.703 187227 DEBUG nova.virt.hardware [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 28 17:42:31 compute-0 nova_compute[187223]: 2025-11-28 17:42:31.704 187227 DEBUG nova.virt.hardware [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 28 17:42:31 compute-0 nova_compute[187223]: 2025-11-28 17:42:31.704 187227 DEBUG nova.virt.hardware [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 28 17:42:31 compute-0 nova_compute[187223]: 2025-11-28 17:42:31.705 187227 DEBUG nova.virt.hardware [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 28 17:42:31 compute-0 nova_compute[187223]: 2025-11-28 17:42:31.712 187227 DEBUG nova.virt.libvirt.vif [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T17:42:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-550534645',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-550534645',id=11,image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f987f40adf1f46018ab0ca81b8d954f6',ramdisk_id='',reservation_id='r-4u2oai40',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-384316604',owner_user_name='tempest-TestExecuteStrategies-384316604-project-member'},tags=TagList,task_sta
te='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T17:42:28Z,user_data=None,user_id='40bca16232f3471c8094a414f8874e9a',uuid=34d2d5a1-837e-49dc-a047-618a9ed35dd1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "28f9e34a-6f32-471b-94c6-433b669475ca", "address": "fa:16:3e:b7:22:8e", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap28f9e34a-6f", "ovs_interfaceid": "28f9e34a-6f32-471b-94c6-433b669475ca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 28 17:42:31 compute-0 nova_compute[187223]: 2025-11-28 17:42:31.713 187227 DEBUG nova.network.os_vif_util [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Converting VIF {"id": "28f9e34a-6f32-471b-94c6-433b669475ca", "address": "fa:16:3e:b7:22:8e", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap28f9e34a-6f", "ovs_interfaceid": "28f9e34a-6f32-471b-94c6-433b669475ca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 17:42:31 compute-0 nova_compute[187223]: 2025-11-28 17:42:31.714 187227 DEBUG nova.network.os_vif_util [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b7:22:8e,bridge_name='br-int',has_traffic_filtering=True,id=28f9e34a-6f32-471b-94c6-433b669475ca,network=Network(7710a7d0-31b3-4473-89c4-40533fdd6e7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap28f9e34a-6f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 17:42:31 compute-0 nova_compute[187223]: 2025-11-28 17:42:31.716 187227 DEBUG nova.objects.instance [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lazy-loading 'pci_devices' on Instance uuid 34d2d5a1-837e-49dc-a047-618a9ed35dd1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:42:31 compute-0 nova_compute[187223]: 2025-11-28 17:42:31.732 187227 DEBUG nova.virt.libvirt.driver [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] End _get_guest_xml xml=<domain type="kvm">
Nov 28 17:42:31 compute-0 nova_compute[187223]:   <uuid>34d2d5a1-837e-49dc-a047-618a9ed35dd1</uuid>
Nov 28 17:42:31 compute-0 nova_compute[187223]:   <name>instance-0000000b</name>
Nov 28 17:42:31 compute-0 nova_compute[187223]:   <memory>131072</memory>
Nov 28 17:42:31 compute-0 nova_compute[187223]:   <vcpu>1</vcpu>
Nov 28 17:42:31 compute-0 nova_compute[187223]:   <metadata>
Nov 28 17:42:31 compute-0 nova_compute[187223]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 28 17:42:31 compute-0 nova_compute[187223]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 28 17:42:31 compute-0 nova_compute[187223]:       <nova:name>tempest-TestExecuteStrategies-server-550534645</nova:name>
Nov 28 17:42:31 compute-0 nova_compute[187223]:       <nova:creationTime>2025-11-28 17:42:31</nova:creationTime>
Nov 28 17:42:31 compute-0 nova_compute[187223]:       <nova:flavor name="m1.nano">
Nov 28 17:42:31 compute-0 nova_compute[187223]:         <nova:memory>128</nova:memory>
Nov 28 17:42:31 compute-0 nova_compute[187223]:         <nova:disk>1</nova:disk>
Nov 28 17:42:31 compute-0 nova_compute[187223]:         <nova:swap>0</nova:swap>
Nov 28 17:42:31 compute-0 nova_compute[187223]:         <nova:ephemeral>0</nova:ephemeral>
Nov 28 17:42:31 compute-0 nova_compute[187223]:         <nova:vcpus>1</nova:vcpus>
Nov 28 17:42:31 compute-0 nova_compute[187223]:       </nova:flavor>
Nov 28 17:42:31 compute-0 nova_compute[187223]:       <nova:owner>
Nov 28 17:42:31 compute-0 nova_compute[187223]:         <nova:user uuid="40bca16232f3471c8094a414f8874e9a">tempest-TestExecuteStrategies-384316604-project-member</nova:user>
Nov 28 17:42:31 compute-0 nova_compute[187223]:         <nova:project uuid="f987f40adf1f46018ab0ca81b8d954f6">tempest-TestExecuteStrategies-384316604</nova:project>
Nov 28 17:42:31 compute-0 nova_compute[187223]:       </nova:owner>
Nov 28 17:42:31 compute-0 nova_compute[187223]:       <nova:root type="image" uuid="e66bcfff-a835-4b6a-9892-490d158c356a"/>
Nov 28 17:42:31 compute-0 nova_compute[187223]:       <nova:ports>
Nov 28 17:42:31 compute-0 nova_compute[187223]:         <nova:port uuid="28f9e34a-6f32-471b-94c6-433b669475ca">
Nov 28 17:42:31 compute-0 nova_compute[187223]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 28 17:42:31 compute-0 nova_compute[187223]:         </nova:port>
Nov 28 17:42:31 compute-0 nova_compute[187223]:       </nova:ports>
Nov 28 17:42:31 compute-0 nova_compute[187223]:     </nova:instance>
Nov 28 17:42:31 compute-0 nova_compute[187223]:   </metadata>
Nov 28 17:42:31 compute-0 nova_compute[187223]:   <sysinfo type="smbios">
Nov 28 17:42:31 compute-0 nova_compute[187223]:     <system>
Nov 28 17:42:31 compute-0 nova_compute[187223]:       <entry name="manufacturer">RDO</entry>
Nov 28 17:42:31 compute-0 nova_compute[187223]:       <entry name="product">OpenStack Compute</entry>
Nov 28 17:42:31 compute-0 nova_compute[187223]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 28 17:42:31 compute-0 nova_compute[187223]:       <entry name="serial">34d2d5a1-837e-49dc-a047-618a9ed35dd1</entry>
Nov 28 17:42:31 compute-0 nova_compute[187223]:       <entry name="uuid">34d2d5a1-837e-49dc-a047-618a9ed35dd1</entry>
Nov 28 17:42:31 compute-0 nova_compute[187223]:       <entry name="family">Virtual Machine</entry>
Nov 28 17:42:31 compute-0 nova_compute[187223]:     </system>
Nov 28 17:42:31 compute-0 nova_compute[187223]:   </sysinfo>
Nov 28 17:42:31 compute-0 nova_compute[187223]:   <os>
Nov 28 17:42:31 compute-0 nova_compute[187223]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 28 17:42:31 compute-0 nova_compute[187223]:     <boot dev="hd"/>
Nov 28 17:42:31 compute-0 nova_compute[187223]:     <smbios mode="sysinfo"/>
Nov 28 17:42:31 compute-0 nova_compute[187223]:   </os>
Nov 28 17:42:31 compute-0 nova_compute[187223]:   <features>
Nov 28 17:42:31 compute-0 nova_compute[187223]:     <acpi/>
Nov 28 17:42:31 compute-0 nova_compute[187223]:     <apic/>
Nov 28 17:42:31 compute-0 nova_compute[187223]:     <vmcoreinfo/>
Nov 28 17:42:31 compute-0 nova_compute[187223]:   </features>
Nov 28 17:42:31 compute-0 nova_compute[187223]:   <clock offset="utc">
Nov 28 17:42:31 compute-0 nova_compute[187223]:     <timer name="pit" tickpolicy="delay"/>
Nov 28 17:42:31 compute-0 nova_compute[187223]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 28 17:42:31 compute-0 nova_compute[187223]:     <timer name="hpet" present="no"/>
Nov 28 17:42:31 compute-0 nova_compute[187223]:   </clock>
Nov 28 17:42:31 compute-0 nova_compute[187223]:   <cpu mode="custom" match="exact">
Nov 28 17:42:31 compute-0 nova_compute[187223]:     <model>Nehalem</model>
Nov 28 17:42:31 compute-0 nova_compute[187223]:     <topology sockets="1" cores="1" threads="1"/>
Nov 28 17:42:31 compute-0 nova_compute[187223]:   </cpu>
Nov 28 17:42:31 compute-0 nova_compute[187223]:   <devices>
Nov 28 17:42:31 compute-0 nova_compute[187223]:     <disk type="file" device="disk">
Nov 28 17:42:31 compute-0 nova_compute[187223]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 28 17:42:31 compute-0 nova_compute[187223]:       <source file="/var/lib/nova/instances/34d2d5a1-837e-49dc-a047-618a9ed35dd1/disk"/>
Nov 28 17:42:31 compute-0 nova_compute[187223]:       <target dev="vda" bus="virtio"/>
Nov 28 17:42:31 compute-0 nova_compute[187223]:     </disk>
Nov 28 17:42:31 compute-0 nova_compute[187223]:     <disk type="file" device="cdrom">
Nov 28 17:42:31 compute-0 nova_compute[187223]:       <driver name="qemu" type="raw" cache="none"/>
Nov 28 17:42:31 compute-0 nova_compute[187223]:       <source file="/var/lib/nova/instances/34d2d5a1-837e-49dc-a047-618a9ed35dd1/disk.config"/>
Nov 28 17:42:31 compute-0 nova_compute[187223]:       <target dev="sda" bus="sata"/>
Nov 28 17:42:31 compute-0 nova_compute[187223]:     </disk>
Nov 28 17:42:31 compute-0 nova_compute[187223]:     <interface type="ethernet">
Nov 28 17:42:31 compute-0 nova_compute[187223]:       <mac address="fa:16:3e:b7:22:8e"/>
Nov 28 17:42:31 compute-0 nova_compute[187223]:       <model type="virtio"/>
Nov 28 17:42:31 compute-0 nova_compute[187223]:       <driver name="vhost" rx_queue_size="512"/>
Nov 28 17:42:31 compute-0 nova_compute[187223]:       <mtu size="1442"/>
Nov 28 17:42:31 compute-0 nova_compute[187223]:       <target dev="tap28f9e34a-6f"/>
Nov 28 17:42:31 compute-0 nova_compute[187223]:     </interface>
Nov 28 17:42:31 compute-0 nova_compute[187223]:     <serial type="pty">
Nov 28 17:42:31 compute-0 nova_compute[187223]:       <log file="/var/lib/nova/instances/34d2d5a1-837e-49dc-a047-618a9ed35dd1/console.log" append="off"/>
Nov 28 17:42:31 compute-0 nova_compute[187223]:     </serial>
Nov 28 17:42:31 compute-0 nova_compute[187223]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 28 17:42:31 compute-0 nova_compute[187223]:     <video>
Nov 28 17:42:31 compute-0 nova_compute[187223]:       <model type="virtio"/>
Nov 28 17:42:31 compute-0 nova_compute[187223]:     </video>
Nov 28 17:42:31 compute-0 nova_compute[187223]:     <input type="tablet" bus="usb"/>
Nov 28 17:42:31 compute-0 nova_compute[187223]:     <rng model="virtio">
Nov 28 17:42:31 compute-0 nova_compute[187223]:       <backend model="random">/dev/urandom</backend>
Nov 28 17:42:31 compute-0 nova_compute[187223]:     </rng>
Nov 28 17:42:31 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root"/>
Nov 28 17:42:31 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:42:31 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:42:31 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:42:31 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:42:31 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:42:31 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:42:31 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:42:31 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:42:31 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:42:31 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:42:31 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:42:31 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:42:31 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:42:31 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:42:31 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:42:31 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:42:31 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:42:31 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:42:31 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:42:31 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:42:31 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:42:31 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:42:31 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:42:31 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:42:31 compute-0 nova_compute[187223]:     <controller type="usb" index="0"/>
Nov 28 17:42:31 compute-0 nova_compute[187223]:     <memballoon model="virtio">
Nov 28 17:42:31 compute-0 nova_compute[187223]:       <stats period="10"/>
Nov 28 17:42:31 compute-0 nova_compute[187223]:     </memballoon>
Nov 28 17:42:31 compute-0 nova_compute[187223]:   </devices>
Nov 28 17:42:31 compute-0 nova_compute[187223]: </domain>
Nov 28 17:42:31 compute-0 nova_compute[187223]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 28 17:42:31 compute-0 nova_compute[187223]: 2025-11-28 17:42:31.734 187227 DEBUG nova.compute.manager [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Preparing to wait for external event network-vif-plugged-28f9e34a-6f32-471b-94c6-433b669475ca prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 28 17:42:31 compute-0 nova_compute[187223]: 2025-11-28 17:42:31.735 187227 DEBUG oslo_concurrency.lockutils [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Acquiring lock "34d2d5a1-837e-49dc-a047-618a9ed35dd1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:42:31 compute-0 nova_compute[187223]: 2025-11-28 17:42:31.736 187227 DEBUG oslo_concurrency.lockutils [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "34d2d5a1-837e-49dc-a047-618a9ed35dd1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:42:31 compute-0 nova_compute[187223]: 2025-11-28 17:42:31.736 187227 DEBUG oslo_concurrency.lockutils [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "34d2d5a1-837e-49dc-a047-618a9ed35dd1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:42:31 compute-0 nova_compute[187223]: 2025-11-28 17:42:31.738 187227 DEBUG nova.virt.libvirt.vif [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T17:42:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-550534645',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-550534645',id=11,image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f987f40adf1f46018ab0ca81b8d954f6',ramdisk_id='',reservation_id='r-4u2oai40',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-384316604',owner_user_name='tempest-TestExecuteStrategies-384316604-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T17:42:28Z,user_data=None,user_id='40bca16232f3471c8094a414f8874e9a',uuid=34d2d5a1-837e-49dc-a047-618a9ed35dd1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "28f9e34a-6f32-471b-94c6-433b669475ca", "address": "fa:16:3e:b7:22:8e", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap28f9e34a-6f", "ovs_interfaceid": "28f9e34a-6f32-471b-94c6-433b669475ca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 28 17:42:31 compute-0 nova_compute[187223]: 2025-11-28 17:42:31.738 187227 DEBUG nova.network.os_vif_util [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Converting VIF {"id": "28f9e34a-6f32-471b-94c6-433b669475ca", "address": "fa:16:3e:b7:22:8e", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap28f9e34a-6f", "ovs_interfaceid": "28f9e34a-6f32-471b-94c6-433b669475ca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 17:42:31 compute-0 nova_compute[187223]: 2025-11-28 17:42:31.740 187227 DEBUG nova.network.os_vif_util [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b7:22:8e,bridge_name='br-int',has_traffic_filtering=True,id=28f9e34a-6f32-471b-94c6-433b669475ca,network=Network(7710a7d0-31b3-4473-89c4-40533fdd6e7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap28f9e34a-6f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 17:42:31 compute-0 nova_compute[187223]: 2025-11-28 17:42:31.740 187227 DEBUG os_vif [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b7:22:8e,bridge_name='br-int',has_traffic_filtering=True,id=28f9e34a-6f32-471b-94c6-433b669475ca,network=Network(7710a7d0-31b3-4473-89c4-40533fdd6e7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap28f9e34a-6f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 28 17:42:31 compute-0 nova_compute[187223]: 2025-11-28 17:42:31.741 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:42:31 compute-0 nova_compute[187223]: 2025-11-28 17:42:31.742 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:42:31 compute-0 nova_compute[187223]: 2025-11-28 17:42:31.743 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 17:42:31 compute-0 nova_compute[187223]: 2025-11-28 17:42:31.747 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:42:31 compute-0 nova_compute[187223]: 2025-11-28 17:42:31.748 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap28f9e34a-6f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:42:31 compute-0 nova_compute[187223]: 2025-11-28 17:42:31.749 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap28f9e34a-6f, col_values=(('external_ids', {'iface-id': '28f9e34a-6f32-471b-94c6-433b669475ca', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b7:22:8e', 'vm-uuid': '34d2d5a1-837e-49dc-a047-618a9ed35dd1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:42:31 compute-0 nova_compute[187223]: 2025-11-28 17:42:31.752 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:42:31 compute-0 nova_compute[187223]: 2025-11-28 17:42:31.754 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 17:42:31 compute-0 NetworkManager[55763]: <info>  [1764351751.7553] manager: (tap28f9e34a-6f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/44)
Nov 28 17:42:31 compute-0 nova_compute[187223]: 2025-11-28 17:42:31.760 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:42:31 compute-0 nova_compute[187223]: 2025-11-28 17:42:31.762 187227 INFO os_vif [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b7:22:8e,bridge_name='br-int',has_traffic_filtering=True,id=28f9e34a-6f32-471b-94c6-433b669475ca,network=Network(7710a7d0-31b3-4473-89c4-40533fdd6e7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap28f9e34a-6f')
Nov 28 17:42:31 compute-0 nova_compute[187223]: 2025-11-28 17:42:31.818 187227 DEBUG nova.virt.libvirt.driver [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 28 17:42:31 compute-0 nova_compute[187223]: 2025-11-28 17:42:31.819 187227 DEBUG nova.virt.libvirt.driver [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 28 17:42:31 compute-0 nova_compute[187223]: 2025-11-28 17:42:31.819 187227 DEBUG nova.virt.libvirt.driver [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] No VIF found with MAC fa:16:3e:b7:22:8e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 28 17:42:31 compute-0 nova_compute[187223]: 2025-11-28 17:42:31.820 187227 INFO nova.virt.libvirt.driver [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Using config drive
Nov 28 17:42:32 compute-0 nova_compute[187223]: 2025-11-28 17:42:32.710 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:42:33 compute-0 nova_compute[187223]: 2025-11-28 17:42:33.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:42:33 compute-0 nova_compute[187223]: 2025-11-28 17:42:33.684 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 17:42:33 compute-0 nova_compute[187223]: 2025-11-28 17:42:33.685 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:42:33 compute-0 nova_compute[187223]: 2025-11-28 17:42:33.702 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:42:33 compute-0 nova_compute[187223]: 2025-11-28 17:42:33.713 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:42:33 compute-0 nova_compute[187223]: 2025-11-28 17:42:33.713 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:42:33 compute-0 nova_compute[187223]: 2025-11-28 17:42:33.714 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:42:33 compute-0 nova_compute[187223]: 2025-11-28 17:42:33.714 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 17:42:33 compute-0 nova_compute[187223]: 2025-11-28 17:42:33.778 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/34d2d5a1-837e-49dc-a047-618a9ed35dd1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:42:33 compute-0 nova_compute[187223]: 2025-11-28 17:42:33.846 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/34d2d5a1-837e-49dc-a047-618a9ed35dd1/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:42:33 compute-0 nova_compute[187223]: 2025-11-28 17:42:33.847 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/34d2d5a1-837e-49dc-a047-618a9ed35dd1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:42:33 compute-0 nova_compute[187223]: 2025-11-28 17:42:33.912 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/34d2d5a1-837e-49dc-a047-618a9ed35dd1/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:42:33 compute-0 nova_compute[187223]: 2025-11-28 17:42:33.914 187227 WARNING nova.virt.libvirt.driver [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Periodic task is updating the host stat, it is trying to get disk instance-0000000b, but disk file was removed by concurrent operations such as resize.: FileNotFoundError: [Errno 2] No such file or directory: '/var/lib/nova/instances/34d2d5a1-837e-49dc-a047-618a9ed35dd1/disk.config'
Nov 28 17:42:34 compute-0 nova_compute[187223]: 2025-11-28 17:42:34.092 187227 WARNING nova.virt.libvirt.driver [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 17:42:34 compute-0 nova_compute[187223]: 2025-11-28 17:42:34.093 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5885MB free_disk=73.3410873413086GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 17:42:34 compute-0 nova_compute[187223]: 2025-11-28 17:42:34.094 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:42:34 compute-0 nova_compute[187223]: 2025-11-28 17:42:34.094 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:42:34 compute-0 nova_compute[187223]: 2025-11-28 17:42:34.255 187227 INFO nova.virt.libvirt.driver [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Creating config drive at /var/lib/nova/instances/34d2d5a1-837e-49dc-a047-618a9ed35dd1/disk.config
Nov 28 17:42:34 compute-0 nova_compute[187223]: 2025-11-28 17:42:34.260 187227 DEBUG oslo_concurrency.processutils [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/34d2d5a1-837e-49dc-a047-618a9ed35dd1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxl29n67i execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:42:34 compute-0 nova_compute[187223]: 2025-11-28 17:42:34.287 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Instance 34d2d5a1-837e-49dc-a047-618a9ed35dd1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 17:42:34 compute-0 nova_compute[187223]: 2025-11-28 17:42:34.288 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 17:42:34 compute-0 nova_compute[187223]: 2025-11-28 17:42:34.288 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 17:42:34 compute-0 nova_compute[187223]: 2025-11-28 17:42:34.399 187227 DEBUG oslo_concurrency.processutils [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/34d2d5a1-837e-49dc-a047-618a9ed35dd1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxl29n67i" returned: 0 in 0.138s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:42:34 compute-0 nova_compute[187223]: 2025-11-28 17:42:34.487 187227 DEBUG nova.compute.provider_tree [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 17:42:34 compute-0 kernel: tap28f9e34a-6f: entered promiscuous mode
Nov 28 17:42:34 compute-0 NetworkManager[55763]: <info>  [1764351754.5018] manager: (tap28f9e34a-6f): new Tun device (/org/freedesktop/NetworkManager/Devices/45)
Nov 28 17:42:34 compute-0 ovn_controller[95574]: 2025-11-28T17:42:34Z|00103|binding|INFO|Claiming lport 28f9e34a-6f32-471b-94c6-433b669475ca for this chassis.
Nov 28 17:42:34 compute-0 ovn_controller[95574]: 2025-11-28T17:42:34Z|00104|binding|INFO|28f9e34a-6f32-471b-94c6-433b669475ca: Claiming fa:16:3e:b7:22:8e 10.100.0.12
Nov 28 17:42:34 compute-0 nova_compute[187223]: 2025-11-28 17:42:34.534 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:42:34 compute-0 nova_compute[187223]: 2025-11-28 17:42:34.536 187227 DEBUG nova.scheduler.client.report [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 17:42:34 compute-0 nova_compute[187223]: 2025-11-28 17:42:34.541 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:42:34 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:42:34.555 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b7:22:8e 10.100.0.12'], port_security=['fa:16:3e:b7:22:8e 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '34d2d5a1-837e-49dc-a047-618a9ed35dd1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f987f40adf1f46018ab0ca81b8d954f6', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2454196c-3f33-40c4-b65e-ff5ce51cee25', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8b91a17c-2725-4cda-baa8-a6987e8e4bba, chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], logical_port=28f9e34a-6f32-471b-94c6-433b669475ca) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 17:42:34 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:42:34.557 104433 INFO neutron.agent.ovn.metadata.agent [-] Port 28f9e34a-6f32-471b-94c6-433b669475ca in datapath 7710a7d0-31b3-4473-89c4-40533fdd6e7d bound to our chassis
Nov 28 17:42:34 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:42:34.558 104433 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7710a7d0-31b3-4473-89c4-40533fdd6e7d
Nov 28 17:42:34 compute-0 systemd-udevd[212752]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 17:42:34 compute-0 nova_compute[187223]: 2025-11-28 17:42:34.567 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 17:42:34 compute-0 nova_compute[187223]: 2025-11-28 17:42:34.567 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.473s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:42:34 compute-0 systemd-machined[153517]: New machine qemu-9-instance-0000000b.
Nov 28 17:42:34 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:42:34.571 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[5e351bb6-84e9-4e0c-8466-4390cb637216]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:42:34 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:42:34.572 104433 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7710a7d0-31 in ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 28 17:42:34 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:42:34.574 208826 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7710a7d0-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 28 17:42:34 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:42:34.575 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[5f5c6610-1d1b-48e9-ba86-39d18a170629]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:42:34 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:42:34.575 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[bc565219-4d0f-477d-8912-64434e452088]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:42:34 compute-0 NetworkManager[55763]: <info>  [1764351754.5845] device (tap28f9e34a-6f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 28 17:42:34 compute-0 NetworkManager[55763]: <info>  [1764351754.5860] device (tap28f9e34a-6f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 28 17:42:34 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:42:34.589 104546 DEBUG oslo.privsep.daemon [-] privsep: reply[44f75b7c-f86c-40b6-8a88-6cfeeaae2c6d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:42:34 compute-0 ovn_controller[95574]: 2025-11-28T17:42:34Z|00105|binding|INFO|Setting lport 28f9e34a-6f32-471b-94c6-433b669475ca ovn-installed in OVS
Nov 28 17:42:34 compute-0 ovn_controller[95574]: 2025-11-28T17:42:34Z|00106|binding|INFO|Setting lport 28f9e34a-6f32-471b-94c6-433b669475ca up in Southbound
Nov 28 17:42:34 compute-0 nova_compute[187223]: 2025-11-28 17:42:34.597 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:42:34 compute-0 systemd[1]: Started Virtual Machine qemu-9-instance-0000000b.
Nov 28 17:42:34 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:42:34.617 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[070c2d93-730a-49f1-af9f-342795dbd3f1]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:42:34 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:42:34.653 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[0b2aeefc-269f-4446-b0e4-1c2fa2a07b30]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:42:34 compute-0 systemd-udevd[212757]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 17:42:34 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:42:34.661 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[3c7637e2-f47d-4839-a529-057f222b300b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:42:34 compute-0 NetworkManager[55763]: <info>  [1764351754.6634] manager: (tap7710a7d0-30): new Veth device (/org/freedesktop/NetworkManager/Devices/46)
Nov 28 17:42:34 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:42:34.692 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[450d3ca9-d844-41a3-a767-3fa02d53ece8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:42:34 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:42:34.696 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[d8345247-4c8a-4d06-b358-3c16a7df407d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:42:34 compute-0 NetworkManager[55763]: <info>  [1764351754.7211] device (tap7710a7d0-30): carrier: link connected
Nov 28 17:42:34 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:42:34.725 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[eaf55cfb-479f-416e-9673-363b037acd36]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:42:34 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:42:34.747 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[ada89c67-4705-4adc-9cdc-508dcbb14d5f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7710a7d0-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7e:b9:9f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 31], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 492935, 'reachable_time': 32831, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212786, 'error': None, 'target': 'ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:42:34 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:42:34.767 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[74814d75-3753-4da3-992f-46a3601e7254]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe7e:b99f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 492935, 'tstamp': 492935}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212787, 'error': None, 'target': 'ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:42:34 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:42:34.787 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[0702bb45-9c83-4801-92aa-a70c80b18420]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7710a7d0-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7e:b9:9f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 31], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 492935, 'reachable_time': 32831, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 212788, 'error': None, 'target': 'ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:42:34 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:42:34.836 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[5e2a6ac7-8ca8-4c65-b141-45d7b63f6a2c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:42:34 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:42:34.921 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[1164af7a-b5ff-49e1-a209-6a883d84351f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:42:34 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:42:34.925 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7710a7d0-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:42:34 compute-0 nova_compute[187223]: 2025-11-28 17:42:34.926 187227 DEBUG nova.virt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Emitting event <LifecycleEvent: 1764351754.9251552, 34d2d5a1-837e-49dc-a047-618a9ed35dd1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:42:34 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:42:34.926 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 17:42:34 compute-0 nova_compute[187223]: 2025-11-28 17:42:34.926 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] VM Started (Lifecycle Event)
Nov 28 17:42:34 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:42:34.927 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7710a7d0-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:42:34 compute-0 nova_compute[187223]: 2025-11-28 17:42:34.929 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:42:34 compute-0 NetworkManager[55763]: <info>  [1764351754.9302] manager: (tap7710a7d0-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/47)
Nov 28 17:42:34 compute-0 kernel: tap7710a7d0-30: entered promiscuous mode
Nov 28 17:42:34 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:42:34.933 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7710a7d0-30, col_values=(('external_ids', {'iface-id': 'bc789832-2d4b-4b14-95c2-e30a740a3a6b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:42:34 compute-0 ovn_controller[95574]: 2025-11-28T17:42:34Z|00107|binding|INFO|Releasing lport bc789832-2d4b-4b14-95c2-e30a740a3a6b from this chassis (sb_readonly=0)
Nov 28 17:42:34 compute-0 nova_compute[187223]: 2025-11-28 17:42:34.934 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:42:34 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:42:34.947 104433 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7710a7d0-31b3-4473-89c4-40533fdd6e7d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7710a7d0-31b3-4473-89c4-40533fdd6e7d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 28 17:42:34 compute-0 nova_compute[187223]: 2025-11-28 17:42:34.947 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:42:34 compute-0 nova_compute[187223]: 2025-11-28 17:42:34.948 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:42:34 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:42:34.948 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[d1a3dd07-31d1-4b6e-aef1-7aefd4790b87]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:42:34 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:42:34.949 104433 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 28 17:42:34 compute-0 ovn_metadata_agent[104428]: global
Nov 28 17:42:34 compute-0 ovn_metadata_agent[104428]:     log         /dev/log local0 debug
Nov 28 17:42:34 compute-0 ovn_metadata_agent[104428]:     log-tag     haproxy-metadata-proxy-7710a7d0-31b3-4473-89c4-40533fdd6e7d
Nov 28 17:42:34 compute-0 ovn_metadata_agent[104428]:     user        root
Nov 28 17:42:34 compute-0 ovn_metadata_agent[104428]:     group       root
Nov 28 17:42:34 compute-0 ovn_metadata_agent[104428]:     maxconn     1024
Nov 28 17:42:34 compute-0 ovn_metadata_agent[104428]:     pidfile     /var/lib/neutron/external/pids/7710a7d0-31b3-4473-89c4-40533fdd6e7d.pid.haproxy
Nov 28 17:42:34 compute-0 ovn_metadata_agent[104428]:     daemon
Nov 28 17:42:34 compute-0 ovn_metadata_agent[104428]: 
Nov 28 17:42:34 compute-0 ovn_metadata_agent[104428]: defaults
Nov 28 17:42:34 compute-0 ovn_metadata_agent[104428]:     log global
Nov 28 17:42:34 compute-0 ovn_metadata_agent[104428]:     mode http
Nov 28 17:42:34 compute-0 ovn_metadata_agent[104428]:     option httplog
Nov 28 17:42:34 compute-0 ovn_metadata_agent[104428]:     option dontlognull
Nov 28 17:42:34 compute-0 ovn_metadata_agent[104428]:     option http-server-close
Nov 28 17:42:34 compute-0 ovn_metadata_agent[104428]:     option forwardfor
Nov 28 17:42:34 compute-0 ovn_metadata_agent[104428]:     retries                 3
Nov 28 17:42:34 compute-0 ovn_metadata_agent[104428]:     timeout http-request    30s
Nov 28 17:42:34 compute-0 ovn_metadata_agent[104428]:     timeout connect         30s
Nov 28 17:42:34 compute-0 ovn_metadata_agent[104428]:     timeout client          32s
Nov 28 17:42:34 compute-0 ovn_metadata_agent[104428]:     timeout server          32s
Nov 28 17:42:34 compute-0 ovn_metadata_agent[104428]:     timeout http-keep-alive 30s
Nov 28 17:42:34 compute-0 ovn_metadata_agent[104428]: 
Nov 28 17:42:34 compute-0 ovn_metadata_agent[104428]: 
Nov 28 17:42:34 compute-0 ovn_metadata_agent[104428]: listen listener
Nov 28 17:42:34 compute-0 ovn_metadata_agent[104428]:     bind 169.254.169.254:80
Nov 28 17:42:34 compute-0 ovn_metadata_agent[104428]:     server metadata /var/lib/neutron/metadata_proxy
Nov 28 17:42:34 compute-0 ovn_metadata_agent[104428]:     http-request add-header X-OVN-Network-ID 7710a7d0-31b3-4473-89c4-40533fdd6e7d
Nov 28 17:42:34 compute-0 ovn_metadata_agent[104428]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 28 17:42:34 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:42:34.951 104433 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'env', 'PROCESS_TAG=haproxy-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7710a7d0-31b3-4473-89c4-40533fdd6e7d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 28 17:42:34 compute-0 nova_compute[187223]: 2025-11-28 17:42:34.953 187227 DEBUG nova.virt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Emitting event <LifecycleEvent: 1764351754.9264584, 34d2d5a1-837e-49dc-a047-618a9ed35dd1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:42:34 compute-0 nova_compute[187223]: 2025-11-28 17:42:34.953 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] VM Paused (Lifecycle Event)
Nov 28 17:42:34 compute-0 nova_compute[187223]: 2025-11-28 17:42:34.970 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:42:34 compute-0 nova_compute[187223]: 2025-11-28 17:42:34.974 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 28 17:42:34 compute-0 nova_compute[187223]: 2025-11-28 17:42:34.995 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 28 17:42:35 compute-0 podman[212827]: 2025-11-28 17:42:35.341327814 +0000 UTC m=+0.061201906 container create ccdf75290551b69d8523f144a204fe580de4d7b5685b13a3e41af05ad464d065 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Nov 28 17:42:35 compute-0 systemd[1]: Started libpod-conmon-ccdf75290551b69d8523f144a204fe580de4d7b5685b13a3e41af05ad464d065.scope.
Nov 28 17:42:35 compute-0 podman[212827]: 2025-11-28 17:42:35.305407556 +0000 UTC m=+0.025281678 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 17:42:35 compute-0 systemd[1]: Started libcrun container.
Nov 28 17:42:35 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8a9122909d82b13e47b2350160b572e73ce2c555a2301780bdea85e23ed44509/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 17:42:35 compute-0 podman[212827]: 2025-11-28 17:42:35.464548417 +0000 UTC m=+0.184422529 container init ccdf75290551b69d8523f144a204fe580de4d7b5685b13a3e41af05ad464d065 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 28 17:42:35 compute-0 podman[212827]: 2025-11-28 17:42:35.471803403 +0000 UTC m=+0.191677495 container start ccdf75290551b69d8523f144a204fe580de4d7b5685b13a3e41af05ad464d065 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true)
Nov 28 17:42:35 compute-0 neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d[212842]: [NOTICE]   (212846) : New worker (212848) forked
Nov 28 17:42:35 compute-0 neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d[212842]: [NOTICE]   (212846) : Loading success.
Nov 28 17:42:35 compute-0 nova_compute[187223]: 2025-11-28 17:42:35.546 187227 DEBUG nova.compute.manager [req-e35f1671-ed60-4f94-9d7e-3627e35d0586 req-070a24fd-c381-4e10-abbb-eaa477dda591 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Received event network-vif-plugged-28f9e34a-6f32-471b-94c6-433b669475ca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:42:35 compute-0 nova_compute[187223]: 2025-11-28 17:42:35.547 187227 DEBUG oslo_concurrency.lockutils [req-e35f1671-ed60-4f94-9d7e-3627e35d0586 req-070a24fd-c381-4e10-abbb-eaa477dda591 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "34d2d5a1-837e-49dc-a047-618a9ed35dd1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:42:35 compute-0 nova_compute[187223]: 2025-11-28 17:42:35.547 187227 DEBUG oslo_concurrency.lockutils [req-e35f1671-ed60-4f94-9d7e-3627e35d0586 req-070a24fd-c381-4e10-abbb-eaa477dda591 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "34d2d5a1-837e-49dc-a047-618a9ed35dd1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:42:35 compute-0 nova_compute[187223]: 2025-11-28 17:42:35.547 187227 DEBUG oslo_concurrency.lockutils [req-e35f1671-ed60-4f94-9d7e-3627e35d0586 req-070a24fd-c381-4e10-abbb-eaa477dda591 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "34d2d5a1-837e-49dc-a047-618a9ed35dd1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:42:35 compute-0 nova_compute[187223]: 2025-11-28 17:42:35.547 187227 DEBUG nova.compute.manager [req-e35f1671-ed60-4f94-9d7e-3627e35d0586 req-070a24fd-c381-4e10-abbb-eaa477dda591 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Processing event network-vif-plugged-28f9e34a-6f32-471b-94c6-433b669475ca _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 28 17:42:35 compute-0 nova_compute[187223]: 2025-11-28 17:42:35.548 187227 DEBUG nova.compute.manager [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 28 17:42:35 compute-0 nova_compute[187223]: 2025-11-28 17:42:35.552 187227 DEBUG nova.virt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Emitting event <LifecycleEvent: 1764351755.552628, 34d2d5a1-837e-49dc-a047-618a9ed35dd1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:42:35 compute-0 nova_compute[187223]: 2025-11-28 17:42:35.553 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] VM Resumed (Lifecycle Event)
Nov 28 17:42:35 compute-0 nova_compute[187223]: 2025-11-28 17:42:35.555 187227 DEBUG nova.virt.libvirt.driver [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 28 17:42:35 compute-0 nova_compute[187223]: 2025-11-28 17:42:35.560 187227 INFO nova.virt.libvirt.driver [-] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Instance spawned successfully.
Nov 28 17:42:35 compute-0 nova_compute[187223]: 2025-11-28 17:42:35.561 187227 DEBUG nova.virt.libvirt.driver [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 28 17:42:35 compute-0 nova_compute[187223]: 2025-11-28 17:42:35.562 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:42:35 compute-0 nova_compute[187223]: 2025-11-28 17:42:35.572 187227 DEBUG nova.network.neutron [req-d7c057f0-af3c-421c-8c66-98efe979f232 req-9a12eb40-4a5e-4ab3-9ff5-b737fd72f8f1 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Updated VIF entry in instance network info cache for port 28f9e34a-6f32-471b-94c6-433b669475ca. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 28 17:42:35 compute-0 nova_compute[187223]: 2025-11-28 17:42:35.573 187227 DEBUG nova.network.neutron [req-d7c057f0-af3c-421c-8c66-98efe979f232 req-9a12eb40-4a5e-4ab3-9ff5-b737fd72f8f1 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Updating instance_info_cache with network_info: [{"id": "28f9e34a-6f32-471b-94c6-433b669475ca", "address": "fa:16:3e:b7:22:8e", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap28f9e34a-6f", "ovs_interfaceid": "28f9e34a-6f32-471b-94c6-433b669475ca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:42:35 compute-0 nova_compute[187223]: 2025-11-28 17:42:35.593 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:42:35 compute-0 nova_compute[187223]: 2025-11-28 17:42:35.603 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 28 17:42:35 compute-0 nova_compute[187223]: 2025-11-28 17:42:35.606 187227 DEBUG nova.virt.libvirt.driver [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:42:35 compute-0 nova_compute[187223]: 2025-11-28 17:42:35.606 187227 DEBUG nova.virt.libvirt.driver [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:42:35 compute-0 nova_compute[187223]: 2025-11-28 17:42:35.607 187227 DEBUG nova.virt.libvirt.driver [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:42:35 compute-0 nova_compute[187223]: 2025-11-28 17:42:35.607 187227 DEBUG nova.virt.libvirt.driver [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:42:35 compute-0 nova_compute[187223]: 2025-11-28 17:42:35.607 187227 DEBUG nova.virt.libvirt.driver [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:42:35 compute-0 nova_compute[187223]: 2025-11-28 17:42:35.608 187227 DEBUG nova.virt.libvirt.driver [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:42:35 compute-0 nova_compute[187223]: 2025-11-28 17:42:35.665 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:42:35 compute-0 nova_compute[187223]: 2025-11-28 17:42:35.677 187227 DEBUG oslo_concurrency.lockutils [req-d7c057f0-af3c-421c-8c66-98efe979f232 req-9a12eb40-4a5e-4ab3-9ff5-b737fd72f8f1 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Releasing lock "refresh_cache-34d2d5a1-837e-49dc-a047-618a9ed35dd1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 17:42:35 compute-0 nova_compute[187223]: 2025-11-28 17:42:35.695 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 28 17:42:35 compute-0 nova_compute[187223]: 2025-11-28 17:42:35.701 187227 INFO nova.compute.manager [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Took 6.92 seconds to spawn the instance on the hypervisor.
Nov 28 17:42:35 compute-0 nova_compute[187223]: 2025-11-28 17:42:35.702 187227 DEBUG nova.compute.manager [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:42:35 compute-0 nova_compute[187223]: 2025-11-28 17:42:35.755 187227 INFO nova.compute.manager [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Took 7.72 seconds to build instance.
Nov 28 17:42:35 compute-0 nova_compute[187223]: 2025-11-28 17:42:35.771 187227 DEBUG oslo_concurrency.lockutils [None req-c331bcbd-8889-4e2d-9791-5ef74b2a517b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "34d2d5a1-837e-49dc-a047-618a9ed35dd1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.803s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:42:36 compute-0 nova_compute[187223]: 2025-11-28 17:42:36.796 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:42:37 compute-0 podman[212857]: 2025-11-28 17:42:37.20342366 +0000 UTC m=+0.057134321 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 28 17:42:37 compute-0 nova_compute[187223]: 2025-11-28 17:42:37.623 187227 DEBUG nova.compute.manager [req-a6b9be2d-1e17-43e9-841c-c1a35c2ed058 req-086670d2-1962-4e85-ade8-82e6f16eb81a 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Received event network-vif-plugged-28f9e34a-6f32-471b-94c6-433b669475ca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:42:37 compute-0 nova_compute[187223]: 2025-11-28 17:42:37.623 187227 DEBUG oslo_concurrency.lockutils [req-a6b9be2d-1e17-43e9-841c-c1a35c2ed058 req-086670d2-1962-4e85-ade8-82e6f16eb81a 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "34d2d5a1-837e-49dc-a047-618a9ed35dd1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:42:37 compute-0 nova_compute[187223]: 2025-11-28 17:42:37.623 187227 DEBUG oslo_concurrency.lockutils [req-a6b9be2d-1e17-43e9-841c-c1a35c2ed058 req-086670d2-1962-4e85-ade8-82e6f16eb81a 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "34d2d5a1-837e-49dc-a047-618a9ed35dd1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:42:37 compute-0 nova_compute[187223]: 2025-11-28 17:42:37.624 187227 DEBUG oslo_concurrency.lockutils [req-a6b9be2d-1e17-43e9-841c-c1a35c2ed058 req-086670d2-1962-4e85-ade8-82e6f16eb81a 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "34d2d5a1-837e-49dc-a047-618a9ed35dd1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:42:37 compute-0 nova_compute[187223]: 2025-11-28 17:42:37.624 187227 DEBUG nova.compute.manager [req-a6b9be2d-1e17-43e9-841c-c1a35c2ed058 req-086670d2-1962-4e85-ade8-82e6f16eb81a 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] No waiting events found dispatching network-vif-plugged-28f9e34a-6f32-471b-94c6-433b669475ca pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:42:37 compute-0 nova_compute[187223]: 2025-11-28 17:42:37.624 187227 WARNING nova.compute.manager [req-a6b9be2d-1e17-43e9-841c-c1a35c2ed058 req-086670d2-1962-4e85-ade8-82e6f16eb81a 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Received unexpected event network-vif-plugged-28f9e34a-6f32-471b-94c6-433b669475ca for instance with vm_state active and task_state None.
Nov 28 17:42:37 compute-0 nova_compute[187223]: 2025-11-28 17:42:37.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:42:37 compute-0 nova_compute[187223]: 2025-11-28 17:42:37.684 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 17:42:37 compute-0 nova_compute[187223]: 2025-11-28 17:42:37.684 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 17:42:38 compute-0 nova_compute[187223]: 2025-11-28 17:42:38.248 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "refresh_cache-34d2d5a1-837e-49dc-a047-618a9ed35dd1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 17:42:38 compute-0 nova_compute[187223]: 2025-11-28 17:42:38.249 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquired lock "refresh_cache-34d2d5a1-837e-49dc-a047-618a9ed35dd1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 17:42:38 compute-0 nova_compute[187223]: 2025-11-28 17:42:38.249 187227 DEBUG nova.network.neutron [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 28 17:42:38 compute-0 nova_compute[187223]: 2025-11-28 17:42:38.250 187227 DEBUG nova.objects.instance [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 34d2d5a1-837e-49dc-a047-618a9ed35dd1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:42:38 compute-0 nova_compute[187223]: 2025-11-28 17:42:38.706 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:42:39 compute-0 nova_compute[187223]: 2025-11-28 17:42:39.529 187227 DEBUG nova.network.neutron [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Updating instance_info_cache with network_info: [{"id": "28f9e34a-6f32-471b-94c6-433b669475ca", "address": "fa:16:3e:b7:22:8e", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap28f9e34a-6f", "ovs_interfaceid": "28f9e34a-6f32-471b-94c6-433b669475ca", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:42:39 compute-0 nova_compute[187223]: 2025-11-28 17:42:39.636 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Releasing lock "refresh_cache-34d2d5a1-837e-49dc-a047-618a9ed35dd1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 17:42:39 compute-0 nova_compute[187223]: 2025-11-28 17:42:39.637 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 28 17:42:39 compute-0 nova_compute[187223]: 2025-11-28 17:42:39.637 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:42:39 compute-0 nova_compute[187223]: 2025-11-28 17:42:39.638 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:42:40 compute-0 nova_compute[187223]: 2025-11-28 17:42:40.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:42:40 compute-0 nova_compute[187223]: 2025-11-28 17:42:40.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:42:41 compute-0 nova_compute[187223]: 2025-11-28 17:42:41.798 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:42:42 compute-0 nova_compute[187223]: 2025-11-28 17:42:42.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:42:43 compute-0 podman[212876]: 2025-11-28 17:42:43.205191786 +0000 UTC m=+0.062805061 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 28 17:42:43 compute-0 podman[212877]: 2025-11-28 17:42:43.238534732 +0000 UTC m=+0.092334199 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 28 17:42:43 compute-0 nova_compute[187223]: 2025-11-28 17:42:43.706 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:42:45 compute-0 nova_compute[187223]: 2025-11-28 17:42:45.699 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:42:45 compute-0 nova_compute[187223]: 2025-11-28 17:42:45.700 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 28 17:42:46 compute-0 podman[212922]: 2025-11-28 17:42:46.207238008 +0000 UTC m=+0.063998865 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal, distribution-scope=public, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.expose-services=, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 
'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 28 17:42:46 compute-0 nova_compute[187223]: 2025-11-28 17:42:46.801 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:42:47 compute-0 ovn_controller[95574]: 2025-11-28T17:42:47Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b7:22:8e 10.100.0.12
Nov 28 17:42:47 compute-0 ovn_controller[95574]: 2025-11-28T17:42:47Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b7:22:8e 10.100.0.12
Nov 28 17:42:48 compute-0 nova_compute[187223]: 2025-11-28 17:42:48.709 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:42:50 compute-0 nova_compute[187223]: 2025-11-28 17:42:50.500 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:42:50 compute-0 nova_compute[187223]: 2025-11-28 17:42:50.527 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Triggering sync for uuid 34d2d5a1-837e-49dc-a047-618a9ed35dd1 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Nov 28 17:42:50 compute-0 nova_compute[187223]: 2025-11-28 17:42:50.528 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "34d2d5a1-837e-49dc-a047-618a9ed35dd1" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:42:50 compute-0 nova_compute[187223]: 2025-11-28 17:42:50.528 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "34d2d5a1-837e-49dc-a047-618a9ed35dd1" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:42:50 compute-0 nova_compute[187223]: 2025-11-28 17:42:50.561 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "34d2d5a1-837e-49dc-a047-618a9ed35dd1" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.032s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:42:51 compute-0 nova_compute[187223]: 2025-11-28 17:42:51.804 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:42:53 compute-0 nova_compute[187223]: 2025-11-28 17:42:53.711 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:42:56 compute-0 nova_compute[187223]: 2025-11-28 17:42:56.807 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:42:58 compute-0 nova_compute[187223]: 2025-11-28 17:42:58.713 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:42:59 compute-0 podman[197556]: time="2025-11-28T17:42:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:42:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:42:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18322 "" "Go-http-client/1.1"
Nov 28 17:42:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:42:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3056 "" "Go-http-client/1.1"
Nov 28 17:43:01 compute-0 openstack_network_exporter[199717]: ERROR   17:43:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:43:01 compute-0 openstack_network_exporter[199717]: ERROR   17:43:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:43:01 compute-0 openstack_network_exporter[199717]: ERROR   17:43:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:43:01 compute-0 openstack_network_exporter[199717]: ERROR   17:43:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:43:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:43:01 compute-0 openstack_network_exporter[199717]: ERROR   17:43:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:43:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:43:01 compute-0 nova_compute[187223]: 2025-11-28 17:43:01.809 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:43:02 compute-0 podman[212952]: 2025-11-28 17:43:02.209966231 +0000 UTC m=+0.056901504 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 17:43:03 compute-0 nova_compute[187223]: 2025-11-28 17:43:03.715 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:43:06 compute-0 nova_compute[187223]: 2025-11-28 17:43:06.811 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:43:08 compute-0 podman[212977]: 2025-11-28 17:43:08.191026611 +0000 UTC m=+0.054932418 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 17:43:08 compute-0 nova_compute[187223]: 2025-11-28 17:43:08.718 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:43:11 compute-0 nova_compute[187223]: 2025-11-28 17:43:11.814 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:43:13 compute-0 nova_compute[187223]: 2025-11-28 17:43:13.720 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:43:14 compute-0 podman[212996]: 2025-11-28 17:43:14.200746123 +0000 UTC m=+0.061616157 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd)
Nov 28 17:43:14 compute-0 podman[212997]: 2025-11-28 17:43:14.239420699 +0000 UTC m=+0.097471053 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 28 17:43:16 compute-0 nova_compute[187223]: 2025-11-28 17:43:16.816 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:43:17 compute-0 podman[213042]: 2025-11-28 17:43:17.205602084 +0000 UTC m=+0.066019392 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, name=ubi9-minimal, config_id=edpm, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, managed_by=edpm_ansible, distribution-scope=public, maintainer=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 28 17:43:18 compute-0 ovn_controller[95574]: 2025-11-28T17:43:18Z|00108|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Nov 28 17:43:18 compute-0 nova_compute[187223]: 2025-11-28 17:43:18.724 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:43:19 compute-0 nova_compute[187223]: 2025-11-28 17:43:19.046 187227 DEBUG nova.compute.manager [None req-a147cba3-b1ee-4534-ab57-7296d5214c56 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Adding trait COMPUTE_STATUS_DISABLED to compute node resource provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 in placement. update_compute_provider_status /usr/lib/python3.9/site-packages/nova/compute/manager.py:610
Nov 28 17:43:19 compute-0 nova_compute[187223]: 2025-11-28 17:43:19.276 187227 DEBUG nova.compute.provider_tree [None req-a147cba3-b1ee-4534-ab57-7296d5214c56 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Updating resource provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 generation from 11 to 13 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Nov 28 17:43:21 compute-0 nova_compute[187223]: 2025-11-28 17:43:21.820 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:43:23 compute-0 nova_compute[187223]: 2025-11-28 17:43:23.726 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:43:24 compute-0 nova_compute[187223]: 2025-11-28 17:43:24.022 187227 DEBUG nova.virt.libvirt.driver [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Check if temp file /var/lib/nova/instances/tmpfiemf_bz exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Nov 28 17:43:24 compute-0 nova_compute[187223]: 2025-11-28 17:43:24.023 187227 DEBUG nova.compute.manager [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpfiemf_bz',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='34d2d5a1-837e-49dc-a047-618a9ed35dd1',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Nov 28 17:43:24 compute-0 nova_compute[187223]: 2025-11-28 17:43:24.720 187227 DEBUG oslo_concurrency.processutils [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/34d2d5a1-837e-49dc-a047-618a9ed35dd1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:43:24 compute-0 nova_compute[187223]: 2025-11-28 17:43:24.795 187227 DEBUG oslo_concurrency.processutils [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/34d2d5a1-837e-49dc-a047-618a9ed35dd1/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:43:24 compute-0 nova_compute[187223]: 2025-11-28 17:43:24.797 187227 DEBUG oslo_concurrency.processutils [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/34d2d5a1-837e-49dc-a047-618a9ed35dd1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:43:24 compute-0 nova_compute[187223]: 2025-11-28 17:43:24.860 187227 DEBUG oslo_concurrency.processutils [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/34d2d5a1-837e-49dc-a047-618a9ed35dd1/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:43:26 compute-0 nova_compute[187223]: 2025-11-28 17:43:26.824 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:43:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:43:27.691 104433 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:43:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:43:27.693 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:43:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:43:27.693 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:43:28 compute-0 nova_compute[187223]: 2025-11-28 17:43:28.728 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:43:29 compute-0 sshd-session[213069]: Accepted publickey for nova from 192.168.122.101 port 47594 ssh2: ECDSA SHA256:MQeyK0000lOAPc/y+l43YtWHgJLzao0oTzY3RbV6Jns
Nov 28 17:43:29 compute-0 systemd-logind[788]: New session 33 of user nova.
Nov 28 17:43:29 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Nov 28 17:43:29 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 28 17:43:29 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 28 17:43:29 compute-0 systemd[1]: Starting User Manager for UID 42436...
Nov 28 17:43:29 compute-0 systemd[213073]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 28 17:43:29 compute-0 systemd[213073]: Queued start job for default target Main User Target.
Nov 28 17:43:29 compute-0 systemd[213073]: Created slice User Application Slice.
Nov 28 17:43:29 compute-0 systemd[213073]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 28 17:43:29 compute-0 systemd[213073]: Started Daily Cleanup of User's Temporary Directories.
Nov 28 17:43:29 compute-0 systemd[213073]: Reached target Paths.
Nov 28 17:43:29 compute-0 systemd[213073]: Reached target Timers.
Nov 28 17:43:29 compute-0 systemd[213073]: Starting D-Bus User Message Bus Socket...
Nov 28 17:43:29 compute-0 systemd[213073]: Starting Create User's Volatile Files and Directories...
Nov 28 17:43:29 compute-0 systemd[213073]: Finished Create User's Volatile Files and Directories.
Nov 28 17:43:29 compute-0 systemd[213073]: Listening on D-Bus User Message Bus Socket.
Nov 28 17:43:29 compute-0 systemd[213073]: Reached target Sockets.
Nov 28 17:43:29 compute-0 systemd[213073]: Reached target Basic System.
Nov 28 17:43:29 compute-0 systemd[213073]: Reached target Main User Target.
Nov 28 17:43:29 compute-0 systemd[213073]: Startup finished in 146ms.
Nov 28 17:43:29 compute-0 systemd[1]: Started User Manager for UID 42436.
Nov 28 17:43:29 compute-0 systemd[1]: Started Session 33 of User nova.
Nov 28 17:43:29 compute-0 sshd-session[213069]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 28 17:43:29 compute-0 podman[197556]: time="2025-11-28T17:43:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:43:29 compute-0 podman[197556]: @ - - [28/Nov/2025:17:43:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18322 "" "Go-http-client/1.1"
Nov 28 17:43:29 compute-0 podman[197556]: @ - - [28/Nov/2025:17:43:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3052 "" "Go-http-client/1.1"
Nov 28 17:43:29 compute-0 sshd-session[213088]: Received disconnect from 192.168.122.101 port 47594:11: disconnected by user
Nov 28 17:43:29 compute-0 sshd-session[213088]: Disconnected from user nova 192.168.122.101 port 47594
Nov 28 17:43:29 compute-0 sshd-session[213069]: pam_unix(sshd:session): session closed for user nova
Nov 28 17:43:29 compute-0 systemd[1]: session-33.scope: Deactivated successfully.
Nov 28 17:43:29 compute-0 systemd-logind[788]: Session 33 logged out. Waiting for processes to exit.
Nov 28 17:43:29 compute-0 systemd-logind[788]: Removed session 33.
Nov 28 17:43:30 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:43:30.793 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3e:3c:af', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '0e:e0:60:f9:5f:25'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 17:43:30 compute-0 nova_compute[187223]: 2025-11-28 17:43:30.795 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:43:30 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:43:30.797 104433 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 17:43:30 compute-0 nova_compute[187223]: 2025-11-28 17:43:30.865 187227 DEBUG nova.compute.manager [req-077e88c3-4781-44ab-807b-c4354eb31369 req-5a31678c-62db-4bfc-86fb-fabfcb5e175b 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Received event network-vif-unplugged-28f9e34a-6f32-471b-94c6-433b669475ca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:43:30 compute-0 nova_compute[187223]: 2025-11-28 17:43:30.865 187227 DEBUG oslo_concurrency.lockutils [req-077e88c3-4781-44ab-807b-c4354eb31369 req-5a31678c-62db-4bfc-86fb-fabfcb5e175b 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "34d2d5a1-837e-49dc-a047-618a9ed35dd1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:43:30 compute-0 nova_compute[187223]: 2025-11-28 17:43:30.866 187227 DEBUG oslo_concurrency.lockutils [req-077e88c3-4781-44ab-807b-c4354eb31369 req-5a31678c-62db-4bfc-86fb-fabfcb5e175b 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "34d2d5a1-837e-49dc-a047-618a9ed35dd1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:43:30 compute-0 nova_compute[187223]: 2025-11-28 17:43:30.866 187227 DEBUG oslo_concurrency.lockutils [req-077e88c3-4781-44ab-807b-c4354eb31369 req-5a31678c-62db-4bfc-86fb-fabfcb5e175b 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "34d2d5a1-837e-49dc-a047-618a9ed35dd1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:43:30 compute-0 nova_compute[187223]: 2025-11-28 17:43:30.867 187227 DEBUG nova.compute.manager [req-077e88c3-4781-44ab-807b-c4354eb31369 req-5a31678c-62db-4bfc-86fb-fabfcb5e175b 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] No waiting events found dispatching network-vif-unplugged-28f9e34a-6f32-471b-94c6-433b669475ca pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:43:30 compute-0 nova_compute[187223]: 2025-11-28 17:43:30.867 187227 DEBUG nova.compute.manager [req-077e88c3-4781-44ab-807b-c4354eb31369 req-5a31678c-62db-4bfc-86fb-fabfcb5e175b 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Received event network-vif-unplugged-28f9e34a-6f32-471b-94c6-433b669475ca for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 28 17:43:31 compute-0 openstack_network_exporter[199717]: ERROR   17:43:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:43:31 compute-0 openstack_network_exporter[199717]: ERROR   17:43:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:43:31 compute-0 openstack_network_exporter[199717]: ERROR   17:43:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:43:31 compute-0 openstack_network_exporter[199717]: ERROR   17:43:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:43:31 compute-0 openstack_network_exporter[199717]: ERROR   17:43:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:43:31 compute-0 nova_compute[187223]: 2025-11-28 17:43:31.552 187227 INFO nova.compute.manager [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Took 6.69 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Nov 28 17:43:31 compute-0 nova_compute[187223]: 2025-11-28 17:43:31.552 187227 DEBUG nova.compute.manager [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 28 17:43:31 compute-0 nova_compute[187223]: 2025-11-28 17:43:31.567 187227 DEBUG nova.compute.manager [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpfiemf_bz',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='34d2d5a1-837e-49dc-a047-618a9ed35dd1',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(ded7c2fa-0f3f-4615-9d88-b64ba53e90b7),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Nov 28 17:43:31 compute-0 nova_compute[187223]: 2025-11-28 17:43:31.591 187227 DEBUG nova.objects.instance [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lazy-loading 'migration_context' on Instance uuid 34d2d5a1-837e-49dc-a047-618a9ed35dd1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:43:31 compute-0 nova_compute[187223]: 2025-11-28 17:43:31.592 187227 DEBUG nova.virt.libvirt.driver [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Nov 28 17:43:31 compute-0 nova_compute[187223]: 2025-11-28 17:43:31.594 187227 DEBUG nova.virt.libvirt.driver [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Nov 28 17:43:31 compute-0 nova_compute[187223]: 2025-11-28 17:43:31.595 187227 DEBUG nova.virt.libvirt.driver [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Nov 28 17:43:31 compute-0 nova_compute[187223]: 2025-11-28 17:43:31.609 187227 DEBUG nova.virt.libvirt.vif [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T17:42:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-550534645',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-550534645',id=11,image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-28T17:42:35Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='f987f40adf1f46018ab0ca81b8d954f6',ramdisk_id='',reservation_id='r-4u2oai40',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-384316604',owner_user_name='tempest-TestExecuteStrategies-384316604-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-28T17:42:35Z,user_data=None,user_id='40bca16232f3471c8094a414f8874e9a',uuid=34d2d5a1-837e-49dc-a047-618a9ed35dd1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "28f9e34a-6f32-471b-94c6-433b669475ca", "address": "fa:16:3e:b7:22:8e", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap28f9e34a-6f", "ovs_interfaceid": "28f9e34a-6f32-471b-94c6-433b669475ca", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 28 17:43:31 compute-0 nova_compute[187223]: 2025-11-28 17:43:31.610 187227 DEBUG nova.network.os_vif_util [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Converting VIF {"id": "28f9e34a-6f32-471b-94c6-433b669475ca", "address": "fa:16:3e:b7:22:8e", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap28f9e34a-6f", "ovs_interfaceid": "28f9e34a-6f32-471b-94c6-433b669475ca", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 17:43:31 compute-0 nova_compute[187223]: 2025-11-28 17:43:31.611 187227 DEBUG nova.network.os_vif_util [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b7:22:8e,bridge_name='br-int',has_traffic_filtering=True,id=28f9e34a-6f32-471b-94c6-433b669475ca,network=Network(7710a7d0-31b3-4473-89c4-40533fdd6e7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap28f9e34a-6f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 17:43:31 compute-0 nova_compute[187223]: 2025-11-28 17:43:31.612 187227 DEBUG nova.virt.libvirt.migration [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Updating guest XML with vif config: <interface type="ethernet">
Nov 28 17:43:31 compute-0 nova_compute[187223]:   <mac address="fa:16:3e:b7:22:8e"/>
Nov 28 17:43:31 compute-0 nova_compute[187223]:   <model type="virtio"/>
Nov 28 17:43:31 compute-0 nova_compute[187223]:   <driver name="vhost" rx_queue_size="512"/>
Nov 28 17:43:31 compute-0 nova_compute[187223]:   <mtu size="1442"/>
Nov 28 17:43:31 compute-0 nova_compute[187223]:   <target dev="tap28f9e34a-6f"/>
Nov 28 17:43:31 compute-0 nova_compute[187223]: </interface>
Nov 28 17:43:31 compute-0 nova_compute[187223]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Nov 28 17:43:31 compute-0 nova_compute[187223]: 2025-11-28 17:43:31.613 187227 DEBUG nova.virt.libvirt.driver [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Nov 28 17:43:31 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:43:31.800 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2ad2dbac-a967-40fb-b69b-7c374c5f8e9d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:43:31 compute-0 nova_compute[187223]: 2025-11-28 17:43:31.826 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:43:32 compute-0 nova_compute[187223]: 2025-11-28 17:43:32.099 187227 DEBUG nova.virt.libvirt.migration [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 28 17:43:32 compute-0 nova_compute[187223]: 2025-11-28 17:43:32.100 187227 INFO nova.virt.libvirt.migration [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Increasing downtime to 50 ms after 0 sec elapsed time
Nov 28 17:43:32 compute-0 nova_compute[187223]: 2025-11-28 17:43:32.184 187227 INFO nova.virt.libvirt.driver [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Nov 28 17:43:32 compute-0 nova_compute[187223]: 2025-11-28 17:43:32.688 187227 DEBUG nova.virt.libvirt.migration [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 28 17:43:32 compute-0 nova_compute[187223]: 2025-11-28 17:43:32.688 187227 DEBUG nova.virt.libvirt.migration [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 28 17:43:32 compute-0 nova_compute[187223]: 2025-11-28 17:43:32.944 187227 DEBUG nova.compute.manager [req-692337c4-dffc-4123-a864-f16dce196896 req-ce0ee173-ff6f-4d20-b6c8-cfda1da6d8ba 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Received event network-vif-plugged-28f9e34a-6f32-471b-94c6-433b669475ca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:43:32 compute-0 nova_compute[187223]: 2025-11-28 17:43:32.944 187227 DEBUG oslo_concurrency.lockutils [req-692337c4-dffc-4123-a864-f16dce196896 req-ce0ee173-ff6f-4d20-b6c8-cfda1da6d8ba 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "34d2d5a1-837e-49dc-a047-618a9ed35dd1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:43:32 compute-0 nova_compute[187223]: 2025-11-28 17:43:32.945 187227 DEBUG oslo_concurrency.lockutils [req-692337c4-dffc-4123-a864-f16dce196896 req-ce0ee173-ff6f-4d20-b6c8-cfda1da6d8ba 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "34d2d5a1-837e-49dc-a047-618a9ed35dd1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:43:32 compute-0 nova_compute[187223]: 2025-11-28 17:43:32.945 187227 DEBUG oslo_concurrency.lockutils [req-692337c4-dffc-4123-a864-f16dce196896 req-ce0ee173-ff6f-4d20-b6c8-cfda1da6d8ba 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "34d2d5a1-837e-49dc-a047-618a9ed35dd1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:43:32 compute-0 nova_compute[187223]: 2025-11-28 17:43:32.945 187227 DEBUG nova.compute.manager [req-692337c4-dffc-4123-a864-f16dce196896 req-ce0ee173-ff6f-4d20-b6c8-cfda1da6d8ba 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] No waiting events found dispatching network-vif-plugged-28f9e34a-6f32-471b-94c6-433b669475ca pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:43:32 compute-0 nova_compute[187223]: 2025-11-28 17:43:32.945 187227 WARNING nova.compute.manager [req-692337c4-dffc-4123-a864-f16dce196896 req-ce0ee173-ff6f-4d20-b6c8-cfda1da6d8ba 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Received unexpected event network-vif-plugged-28f9e34a-6f32-471b-94c6-433b669475ca for instance with vm_state active and task_state migrating.
Nov 28 17:43:32 compute-0 nova_compute[187223]: 2025-11-28 17:43:32.946 187227 DEBUG nova.compute.manager [req-692337c4-dffc-4123-a864-f16dce196896 req-ce0ee173-ff6f-4d20-b6c8-cfda1da6d8ba 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Received event network-changed-28f9e34a-6f32-471b-94c6-433b669475ca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:43:32 compute-0 nova_compute[187223]: 2025-11-28 17:43:32.946 187227 DEBUG nova.compute.manager [req-692337c4-dffc-4123-a864-f16dce196896 req-ce0ee173-ff6f-4d20-b6c8-cfda1da6d8ba 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Refreshing instance network info cache due to event network-changed-28f9e34a-6f32-471b-94c6-433b669475ca. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 28 17:43:32 compute-0 nova_compute[187223]: 2025-11-28 17:43:32.946 187227 DEBUG oslo_concurrency.lockutils [req-692337c4-dffc-4123-a864-f16dce196896 req-ce0ee173-ff6f-4d20-b6c8-cfda1da6d8ba 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "refresh_cache-34d2d5a1-837e-49dc-a047-618a9ed35dd1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 17:43:32 compute-0 nova_compute[187223]: 2025-11-28 17:43:32.946 187227 DEBUG oslo_concurrency.lockutils [req-692337c4-dffc-4123-a864-f16dce196896 req-ce0ee173-ff6f-4d20-b6c8-cfda1da6d8ba 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquired lock "refresh_cache-34d2d5a1-837e-49dc-a047-618a9ed35dd1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 17:43:32 compute-0 nova_compute[187223]: 2025-11-28 17:43:32.947 187227 DEBUG nova.network.neutron [req-692337c4-dffc-4123-a864-f16dce196896 req-ce0ee173-ff6f-4d20-b6c8-cfda1da6d8ba 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Refreshing network info cache for port 28f9e34a-6f32-471b-94c6-433b669475ca _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 28 17:43:33 compute-0 nova_compute[187223]: 2025-11-28 17:43:33.190 187227 DEBUG nova.virt.libvirt.migration [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 28 17:43:33 compute-0 nova_compute[187223]: 2025-11-28 17:43:33.191 187227 DEBUG nova.virt.libvirt.migration [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 28 17:43:33 compute-0 podman[213091]: 2025-11-28 17:43:33.216306847 +0000 UTC m=+0.073216398 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 17:43:33 compute-0 nova_compute[187223]: 2025-11-28 17:43:33.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:43:33 compute-0 nova_compute[187223]: 2025-11-28 17:43:33.685 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 17:43:33 compute-0 nova_compute[187223]: 2025-11-28 17:43:33.694 187227 DEBUG nova.virt.libvirt.migration [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 28 17:43:33 compute-0 nova_compute[187223]: 2025-11-28 17:43:33.695 187227 DEBUG nova.virt.libvirt.migration [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 28 17:43:33 compute-0 nova_compute[187223]: 2025-11-28 17:43:33.730 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:43:34 compute-0 nova_compute[187223]: 2025-11-28 17:43:34.200 187227 DEBUG nova.virt.libvirt.migration [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 28 17:43:34 compute-0 nova_compute[187223]: 2025-11-28 17:43:34.201 187227 DEBUG nova.virt.libvirt.migration [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 28 17:43:34 compute-0 nova_compute[187223]: 2025-11-28 17:43:34.318 187227 DEBUG nova.network.neutron [req-692337c4-dffc-4123-a864-f16dce196896 req-ce0ee173-ff6f-4d20-b6c8-cfda1da6d8ba 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Updated VIF entry in instance network info cache for port 28f9e34a-6f32-471b-94c6-433b669475ca. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 28 17:43:34 compute-0 nova_compute[187223]: 2025-11-28 17:43:34.319 187227 DEBUG nova.network.neutron [req-692337c4-dffc-4123-a864-f16dce196896 req-ce0ee173-ff6f-4d20-b6c8-cfda1da6d8ba 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Updating instance_info_cache with network_info: [{"id": "28f9e34a-6f32-471b-94c6-433b669475ca", "address": "fa:16:3e:b7:22:8e", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap28f9e34a-6f", "ovs_interfaceid": "28f9e34a-6f32-471b-94c6-433b669475ca", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:43:34 compute-0 nova_compute[187223]: 2025-11-28 17:43:34.346 187227 DEBUG oslo_concurrency.lockutils [req-692337c4-dffc-4123-a864-f16dce196896 req-ce0ee173-ff6f-4d20-b6c8-cfda1da6d8ba 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Releasing lock "refresh_cache-34d2d5a1-837e-49dc-a047-618a9ed35dd1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 17:43:34 compute-0 nova_compute[187223]: 2025-11-28 17:43:34.523 187227 DEBUG nova.virt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Emitting event <LifecycleEvent: 1764351814.5229652, 34d2d5a1-837e-49dc-a047-618a9ed35dd1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:43:34 compute-0 nova_compute[187223]: 2025-11-28 17:43:34.524 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] VM Paused (Lifecycle Event)
Nov 28 17:43:34 compute-0 nova_compute[187223]: 2025-11-28 17:43:34.551 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:43:34 compute-0 nova_compute[187223]: 2025-11-28 17:43:34.557 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 28 17:43:34 compute-0 nova_compute[187223]: 2025-11-28 17:43:34.581 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] During sync_power_state the instance has a pending task (migrating). Skip.
Nov 28 17:43:34 compute-0 kernel: tap28f9e34a-6f (unregistering): left promiscuous mode
Nov 28 17:43:34 compute-0 nova_compute[187223]: 2025-11-28 17:43:34.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:43:34 compute-0 NetworkManager[55763]: <info>  [1764351814.6889] device (tap28f9e34a-6f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 28 17:43:34 compute-0 ovn_controller[95574]: 2025-11-28T17:43:34Z|00109|binding|INFO|Releasing lport 28f9e34a-6f32-471b-94c6-433b669475ca from this chassis (sb_readonly=0)
Nov 28 17:43:34 compute-0 ovn_controller[95574]: 2025-11-28T17:43:34Z|00110|binding|INFO|Setting lport 28f9e34a-6f32-471b-94c6-433b669475ca down in Southbound
Nov 28 17:43:34 compute-0 ovn_controller[95574]: 2025-11-28T17:43:34Z|00111|binding|INFO|Removing iface tap28f9e34a-6f ovn-installed in OVS
Nov 28 17:43:34 compute-0 nova_compute[187223]: 2025-11-28 17:43:34.698 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:43:34 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:43:34.707 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b7:22:8e 10.100.0.12'], port_security=['fa:16:3e:b7:22:8e 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '01f1e5e2-191c-41ea-9a37-abbc72987efb'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '34d2d5a1-837e-49dc-a047-618a9ed35dd1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f987f40adf1f46018ab0ca81b8d954f6', 'neutron:revision_number': '8', 'neutron:security_group_ids': '2454196c-3f33-40c4-b65e-ff5ce51cee25', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8b91a17c-2725-4cda-baa8-a6987e8e4bba, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], logical_port=28f9e34a-6f32-471b-94c6-433b669475ca) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 17:43:34 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:43:34.710 104433 INFO neutron.agent.ovn.metadata.agent [-] Port 28f9e34a-6f32-471b-94c6-433b669475ca in datapath 7710a7d0-31b3-4473-89c4-40533fdd6e7d unbound from our chassis
Nov 28 17:43:34 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:43:34.713 104433 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7710a7d0-31b3-4473-89c4-40533fdd6e7d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 17:43:34 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:43:34.716 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[27856bc7-3982-432d-bf4c-a62601f2f0ff]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:43:34 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:43:34.717 104433 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d namespace which is not needed anymore
Nov 28 17:43:34 compute-0 nova_compute[187223]: 2025-11-28 17:43:34.718 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:43:34 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d0000000b.scope: Deactivated successfully.
Nov 28 17:43:34 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d0000000b.scope: Consumed 15.302s CPU time.
Nov 28 17:43:34 compute-0 systemd-machined[153517]: Machine qemu-9-instance-0000000b terminated.
Nov 28 17:43:34 compute-0 neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d[212842]: [NOTICE]   (212846) : haproxy version is 2.8.14-c23fe91
Nov 28 17:43:34 compute-0 neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d[212842]: [NOTICE]   (212846) : path to executable is /usr/sbin/haproxy
Nov 28 17:43:34 compute-0 neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d[212842]: [WARNING]  (212846) : Exiting Master process...
Nov 28 17:43:34 compute-0 neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d[212842]: [WARNING]  (212846) : Exiting Master process...
Nov 28 17:43:34 compute-0 neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d[212842]: [ALERT]    (212846) : Current worker (212848) exited with code 143 (Terminated)
Nov 28 17:43:34 compute-0 neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d[212842]: [WARNING]  (212846) : All workers exited. Exiting... (0)
Nov 28 17:43:34 compute-0 systemd[1]: libpod-ccdf75290551b69d8523f144a204fe580de4d7b5685b13a3e41af05ad464d065.scope: Deactivated successfully.
Nov 28 17:43:34 compute-0 podman[213158]: 2025-11-28 17:43:34.890836184 +0000 UTC m=+0.058177116 container died ccdf75290551b69d8523f144a204fe580de4d7b5685b13a3e41af05ad464d065 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3)
Nov 28 17:43:34 compute-0 nova_compute[187223]: 2025-11-28 17:43:34.894 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:43:34 compute-0 nova_compute[187223]: 2025-11-28 17:43:34.901 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:43:34 compute-0 nova_compute[187223]: 2025-11-28 17:43:34.906 187227 DEBUG nova.compute.manager [req-0d8625ff-aeb4-47af-93fa-dccf0fdaf915 req-8b1fed8a-9449-4d9d-ba8a-2c3a4df4c105 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Received event network-vif-unplugged-28f9e34a-6f32-471b-94c6-433b669475ca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:43:34 compute-0 nova_compute[187223]: 2025-11-28 17:43:34.907 187227 DEBUG oslo_concurrency.lockutils [req-0d8625ff-aeb4-47af-93fa-dccf0fdaf915 req-8b1fed8a-9449-4d9d-ba8a-2c3a4df4c105 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "34d2d5a1-837e-49dc-a047-618a9ed35dd1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:43:34 compute-0 nova_compute[187223]: 2025-11-28 17:43:34.907 187227 DEBUG oslo_concurrency.lockutils [req-0d8625ff-aeb4-47af-93fa-dccf0fdaf915 req-8b1fed8a-9449-4d9d-ba8a-2c3a4df4c105 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "34d2d5a1-837e-49dc-a047-618a9ed35dd1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:43:34 compute-0 nova_compute[187223]: 2025-11-28 17:43:34.907 187227 DEBUG oslo_concurrency.lockutils [req-0d8625ff-aeb4-47af-93fa-dccf0fdaf915 req-8b1fed8a-9449-4d9d-ba8a-2c3a4df4c105 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "34d2d5a1-837e-49dc-a047-618a9ed35dd1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:43:34 compute-0 nova_compute[187223]: 2025-11-28 17:43:34.907 187227 DEBUG nova.compute.manager [req-0d8625ff-aeb4-47af-93fa-dccf0fdaf915 req-8b1fed8a-9449-4d9d-ba8a-2c3a4df4c105 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] No waiting events found dispatching network-vif-unplugged-28f9e34a-6f32-471b-94c6-433b669475ca pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:43:34 compute-0 nova_compute[187223]: 2025-11-28 17:43:34.908 187227 DEBUG nova.compute.manager [req-0d8625ff-aeb4-47af-93fa-dccf0fdaf915 req-8b1fed8a-9449-4d9d-ba8a-2c3a4df4c105 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Received event network-vif-unplugged-28f9e34a-6f32-471b-94c6-433b669475ca for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 28 17:43:34 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ccdf75290551b69d8523f144a204fe580de4d7b5685b13a3e41af05ad464d065-userdata-shm.mount: Deactivated successfully.
Nov 28 17:43:34 compute-0 systemd[1]: var-lib-containers-storage-overlay-8a9122909d82b13e47b2350160b572e73ce2c555a2301780bdea85e23ed44509-merged.mount: Deactivated successfully.
Nov 28 17:43:34 compute-0 podman[213158]: 2025-11-28 17:43:34.945632126 +0000 UTC m=+0.112973038 container cleanup ccdf75290551b69d8523f144a204fe580de4d7b5685b13a3e41af05ad464d065 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 28 17:43:34 compute-0 nova_compute[187223]: 2025-11-28 17:43:34.946 187227 DEBUG nova.virt.libvirt.guest [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Nov 28 17:43:34 compute-0 nova_compute[187223]: 2025-11-28 17:43:34.948 187227 INFO nova.virt.libvirt.driver [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Migration operation has completed
Nov 28 17:43:34 compute-0 nova_compute[187223]: 2025-11-28 17:43:34.948 187227 INFO nova.compute.manager [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] _post_live_migration() is started..
Nov 28 17:43:34 compute-0 systemd[1]: libpod-conmon-ccdf75290551b69d8523f144a204fe580de4d7b5685b13a3e41af05ad464d065.scope: Deactivated successfully.
Nov 28 17:43:34 compute-0 nova_compute[187223]: 2025-11-28 17:43:34.953 187227 DEBUG nova.virt.libvirt.driver [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Nov 28 17:43:34 compute-0 nova_compute[187223]: 2025-11-28 17:43:34.954 187227 DEBUG nova.virt.libvirt.driver [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Nov 28 17:43:34 compute-0 nova_compute[187223]: 2025-11-28 17:43:34.954 187227 DEBUG nova.virt.libvirt.driver [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Nov 28 17:43:35 compute-0 podman[213201]: 2025-11-28 17:43:35.016081337 +0000 UTC m=+0.044585433 container remove ccdf75290551b69d8523f144a204fe580de4d7b5685b13a3e41af05ad464d065 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 28 17:43:35 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:43:35.021 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[5f1e83c1-3ad3-44d3-962e-e4657ec26442]: (4, ('Fri Nov 28 05:43:34 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d (ccdf75290551b69d8523f144a204fe580de4d7b5685b13a3e41af05ad464d065)\nccdf75290551b69d8523f144a204fe580de4d7b5685b13a3e41af05ad464d065\nFri Nov 28 05:43:34 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d (ccdf75290551b69d8523f144a204fe580de4d7b5685b13a3e41af05ad464d065)\nccdf75290551b69d8523f144a204fe580de4d7b5685b13a3e41af05ad464d065\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:43:35 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:43:35.023 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[c7fa49a1-3ee7-49b7-91a6-78525572238c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:43:35 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:43:35.024 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7710a7d0-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:43:35 compute-0 nova_compute[187223]: 2025-11-28 17:43:35.027 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:43:35 compute-0 kernel: tap7710a7d0-30: left promiscuous mode
Nov 28 17:43:35 compute-0 nova_compute[187223]: 2025-11-28 17:43:35.047 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:43:35 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:43:35.051 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[dea03e4f-daba-40a4-8ae4-3c6e853bdb72]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:43:35 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:43:35.069 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[f3fa3d8f-87fa-42ec-a596-669027211943]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:43:35 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:43:35.071 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[15675f11-91aa-4c93-87b9-53daf8b963d9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:43:35 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:43:35.091 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[a8930b32-9617-49df-971b-eb70ab957ef6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 492927, 'reachable_time': 44888, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213219, 'error': None, 'target': 'ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:43:35 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:43:35.096 104546 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 28 17:43:35 compute-0 systemd[1]: run-netns-ovnmeta\x2d7710a7d0\x2d31b3\x2d4473\x2d89c4\x2d40533fdd6e7d.mount: Deactivated successfully.
Nov 28 17:43:35 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:43:35.097 104546 DEBUG oslo.privsep.daemon [-] privsep: reply[b3a9ad4b-f1d8-4e89-aa19-1bec4987d593]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:43:35 compute-0 nova_compute[187223]: 2025-11-28 17:43:35.482 187227 DEBUG nova.compute.manager [req-113099e2-5c39-4b67-a2a7-dd73d940d8d9 req-4bc3bb58-cbc1-4eb5-9307-268ce949295f 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Received event network-vif-unplugged-28f9e34a-6f32-471b-94c6-433b669475ca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:43:35 compute-0 nova_compute[187223]: 2025-11-28 17:43:35.482 187227 DEBUG oslo_concurrency.lockutils [req-113099e2-5c39-4b67-a2a7-dd73d940d8d9 req-4bc3bb58-cbc1-4eb5-9307-268ce949295f 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "34d2d5a1-837e-49dc-a047-618a9ed35dd1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:43:35 compute-0 nova_compute[187223]: 2025-11-28 17:43:35.482 187227 DEBUG oslo_concurrency.lockutils [req-113099e2-5c39-4b67-a2a7-dd73d940d8d9 req-4bc3bb58-cbc1-4eb5-9307-268ce949295f 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "34d2d5a1-837e-49dc-a047-618a9ed35dd1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:43:35 compute-0 nova_compute[187223]: 2025-11-28 17:43:35.482 187227 DEBUG oslo_concurrency.lockutils [req-113099e2-5c39-4b67-a2a7-dd73d940d8d9 req-4bc3bb58-cbc1-4eb5-9307-268ce949295f 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "34d2d5a1-837e-49dc-a047-618a9ed35dd1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:43:35 compute-0 nova_compute[187223]: 2025-11-28 17:43:35.482 187227 DEBUG nova.compute.manager [req-113099e2-5c39-4b67-a2a7-dd73d940d8d9 req-4bc3bb58-cbc1-4eb5-9307-268ce949295f 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] No waiting events found dispatching network-vif-unplugged-28f9e34a-6f32-471b-94c6-433b669475ca pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:43:35 compute-0 nova_compute[187223]: 2025-11-28 17:43:35.483 187227 DEBUG nova.compute.manager [req-113099e2-5c39-4b67-a2a7-dd73d940d8d9 req-4bc3bb58-cbc1-4eb5-9307-268ce949295f 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Received event network-vif-unplugged-28f9e34a-6f32-471b-94c6-433b669475ca for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 28 17:43:35 compute-0 nova_compute[187223]: 2025-11-28 17:43:35.626 187227 DEBUG nova.network.neutron [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Activated binding for port 28f9e34a-6f32-471b-94c6-433b669475ca and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Nov 28 17:43:35 compute-0 nova_compute[187223]: 2025-11-28 17:43:35.627 187227 DEBUG nova.compute.manager [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "28f9e34a-6f32-471b-94c6-433b669475ca", "address": "fa:16:3e:b7:22:8e", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap28f9e34a-6f", "ovs_interfaceid": "28f9e34a-6f32-471b-94c6-433b669475ca", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Nov 28 17:43:35 compute-0 nova_compute[187223]: 2025-11-28 17:43:35.628 187227 DEBUG nova.virt.libvirt.vif [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T17:42:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-550534645',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-550534645',id=11,image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-28T17:42:35Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='f987f40adf1f46018ab0ca81b8d954f6',ramdisk_id='',reservation_id='r-4u2oai40',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',ow
ner_project_name='tempest-TestExecuteStrategies-384316604',owner_user_name='tempest-TestExecuteStrategies-384316604-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-28T17:43:21Z,user_data=None,user_id='40bca16232f3471c8094a414f8874e9a',uuid=34d2d5a1-837e-49dc-a047-618a9ed35dd1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "28f9e34a-6f32-471b-94c6-433b669475ca", "address": "fa:16:3e:b7:22:8e", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap28f9e34a-6f", "ovs_interfaceid": "28f9e34a-6f32-471b-94c6-433b669475ca", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 28 17:43:35 compute-0 nova_compute[187223]: 2025-11-28 17:43:35.628 187227 DEBUG nova.network.os_vif_util [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Converting VIF {"id": "28f9e34a-6f32-471b-94c6-433b669475ca", "address": "fa:16:3e:b7:22:8e", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap28f9e34a-6f", "ovs_interfaceid": "28f9e34a-6f32-471b-94c6-433b669475ca", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 17:43:35 compute-0 nova_compute[187223]: 2025-11-28 17:43:35.629 187227 DEBUG nova.network.os_vif_util [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b7:22:8e,bridge_name='br-int',has_traffic_filtering=True,id=28f9e34a-6f32-471b-94c6-433b669475ca,network=Network(7710a7d0-31b3-4473-89c4-40533fdd6e7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap28f9e34a-6f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 17:43:35 compute-0 nova_compute[187223]: 2025-11-28 17:43:35.629 187227 DEBUG os_vif [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b7:22:8e,bridge_name='br-int',has_traffic_filtering=True,id=28f9e34a-6f32-471b-94c6-433b669475ca,network=Network(7710a7d0-31b3-4473-89c4-40533fdd6e7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap28f9e34a-6f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 28 17:43:35 compute-0 nova_compute[187223]: 2025-11-28 17:43:35.632 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:43:35 compute-0 nova_compute[187223]: 2025-11-28 17:43:35.632 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap28f9e34a-6f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:43:35 compute-0 nova_compute[187223]: 2025-11-28 17:43:35.634 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:43:35 compute-0 nova_compute[187223]: 2025-11-28 17:43:35.636 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 17:43:35 compute-0 nova_compute[187223]: 2025-11-28 17:43:35.639 187227 INFO os_vif [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b7:22:8e,bridge_name='br-int',has_traffic_filtering=True,id=28f9e34a-6f32-471b-94c6-433b669475ca,network=Network(7710a7d0-31b3-4473-89c4-40533fdd6e7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap28f9e34a-6f')
Nov 28 17:43:35 compute-0 nova_compute[187223]: 2025-11-28 17:43:35.640 187227 DEBUG oslo_concurrency.lockutils [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:43:35 compute-0 nova_compute[187223]: 2025-11-28 17:43:35.640 187227 DEBUG oslo_concurrency.lockutils [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:43:35 compute-0 nova_compute[187223]: 2025-11-28 17:43:35.640 187227 DEBUG oslo_concurrency.lockutils [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:43:35 compute-0 nova_compute[187223]: 2025-11-28 17:43:35.640 187227 DEBUG nova.compute.manager [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Nov 28 17:43:35 compute-0 nova_compute[187223]: 2025-11-28 17:43:35.641 187227 INFO nova.virt.libvirt.driver [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Deleting instance files /var/lib/nova/instances/34d2d5a1-837e-49dc-a047-618a9ed35dd1_del
Nov 28 17:43:35 compute-0 nova_compute[187223]: 2025-11-28 17:43:35.642 187227 INFO nova.virt.libvirt.driver [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Deletion of /var/lib/nova/instances/34d2d5a1-837e-49dc-a047-618a9ed35dd1_del complete
Nov 28 17:43:35 compute-0 nova_compute[187223]: 2025-11-28 17:43:35.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:43:35 compute-0 nova_compute[187223]: 2025-11-28 17:43:35.705 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:43:35 compute-0 nova_compute[187223]: 2025-11-28 17:43:35.706 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:43:35 compute-0 nova_compute[187223]: 2025-11-28 17:43:35.706 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:43:35 compute-0 nova_compute[187223]: 2025-11-28 17:43:35.706 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 17:43:35 compute-0 nova_compute[187223]: 2025-11-28 17:43:35.874 187227 WARNING nova.virt.libvirt.driver [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 17:43:35 compute-0 nova_compute[187223]: 2025-11-28 17:43:35.876 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5860MB free_disk=73.34130096435547GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 17:43:35 compute-0 nova_compute[187223]: 2025-11-28 17:43:35.876 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:43:35 compute-0 nova_compute[187223]: 2025-11-28 17:43:35.876 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:43:35 compute-0 nova_compute[187223]: 2025-11-28 17:43:35.941 187227 INFO nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Updating resource usage from migration ded7c2fa-0f3f-4615-9d88-b64ba53e90b7
Nov 28 17:43:35 compute-0 nova_compute[187223]: 2025-11-28 17:43:35.972 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Migration ded7c2fa-0f3f-4615-9d88-b64ba53e90b7 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Nov 28 17:43:35 compute-0 nova_compute[187223]: 2025-11-28 17:43:35.972 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 17:43:35 compute-0 nova_compute[187223]: 2025-11-28 17:43:35.973 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 17:43:36 compute-0 nova_compute[187223]: 2025-11-28 17:43:36.029 187227 DEBUG nova.compute.provider_tree [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 17:43:36 compute-0 nova_compute[187223]: 2025-11-28 17:43:36.052 187227 DEBUG nova.scheduler.client.report [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 17:43:36 compute-0 nova_compute[187223]: 2025-11-28 17:43:36.111 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 17:43:36 compute-0 nova_compute[187223]: 2025-11-28 17:43:36.111 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.235s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:43:36 compute-0 nova_compute[187223]: 2025-11-28 17:43:36.984 187227 DEBUG nova.compute.manager [req-76197762-1028-4dfd-a761-533faab04acc req-74269f7a-9ff2-4576-b887-63333f7dfed0 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Received event network-vif-plugged-28f9e34a-6f32-471b-94c6-433b669475ca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:43:36 compute-0 nova_compute[187223]: 2025-11-28 17:43:36.984 187227 DEBUG oslo_concurrency.lockutils [req-76197762-1028-4dfd-a761-533faab04acc req-74269f7a-9ff2-4576-b887-63333f7dfed0 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "34d2d5a1-837e-49dc-a047-618a9ed35dd1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:43:36 compute-0 nova_compute[187223]: 2025-11-28 17:43:36.984 187227 DEBUG oslo_concurrency.lockutils [req-76197762-1028-4dfd-a761-533faab04acc req-74269f7a-9ff2-4576-b887-63333f7dfed0 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "34d2d5a1-837e-49dc-a047-618a9ed35dd1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:43:36 compute-0 nova_compute[187223]: 2025-11-28 17:43:36.984 187227 DEBUG oslo_concurrency.lockutils [req-76197762-1028-4dfd-a761-533faab04acc req-74269f7a-9ff2-4576-b887-63333f7dfed0 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "34d2d5a1-837e-49dc-a047-618a9ed35dd1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:43:36 compute-0 nova_compute[187223]: 2025-11-28 17:43:36.985 187227 DEBUG nova.compute.manager [req-76197762-1028-4dfd-a761-533faab04acc req-74269f7a-9ff2-4576-b887-63333f7dfed0 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] No waiting events found dispatching network-vif-plugged-28f9e34a-6f32-471b-94c6-433b669475ca pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:43:36 compute-0 nova_compute[187223]: 2025-11-28 17:43:36.985 187227 WARNING nova.compute.manager [req-76197762-1028-4dfd-a761-533faab04acc req-74269f7a-9ff2-4576-b887-63333f7dfed0 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Received unexpected event network-vif-plugged-28f9e34a-6f32-471b-94c6-433b669475ca for instance with vm_state active and task_state migrating.
Nov 28 17:43:36 compute-0 nova_compute[187223]: 2025-11-28 17:43:36.985 187227 DEBUG nova.compute.manager [req-76197762-1028-4dfd-a761-533faab04acc req-74269f7a-9ff2-4576-b887-63333f7dfed0 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Received event network-vif-plugged-28f9e34a-6f32-471b-94c6-433b669475ca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:43:36 compute-0 nova_compute[187223]: 2025-11-28 17:43:36.985 187227 DEBUG oslo_concurrency.lockutils [req-76197762-1028-4dfd-a761-533faab04acc req-74269f7a-9ff2-4576-b887-63333f7dfed0 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "34d2d5a1-837e-49dc-a047-618a9ed35dd1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:43:36 compute-0 nova_compute[187223]: 2025-11-28 17:43:36.985 187227 DEBUG oslo_concurrency.lockutils [req-76197762-1028-4dfd-a761-533faab04acc req-74269f7a-9ff2-4576-b887-63333f7dfed0 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "34d2d5a1-837e-49dc-a047-618a9ed35dd1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:43:36 compute-0 nova_compute[187223]: 2025-11-28 17:43:36.986 187227 DEBUG oslo_concurrency.lockutils [req-76197762-1028-4dfd-a761-533faab04acc req-74269f7a-9ff2-4576-b887-63333f7dfed0 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "34d2d5a1-837e-49dc-a047-618a9ed35dd1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:43:36 compute-0 nova_compute[187223]: 2025-11-28 17:43:36.986 187227 DEBUG nova.compute.manager [req-76197762-1028-4dfd-a761-533faab04acc req-74269f7a-9ff2-4576-b887-63333f7dfed0 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] No waiting events found dispatching network-vif-plugged-28f9e34a-6f32-471b-94c6-433b669475ca pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:43:36 compute-0 nova_compute[187223]: 2025-11-28 17:43:36.986 187227 WARNING nova.compute.manager [req-76197762-1028-4dfd-a761-533faab04acc req-74269f7a-9ff2-4576-b887-63333f7dfed0 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Received unexpected event network-vif-plugged-28f9e34a-6f32-471b-94c6-433b669475ca for instance with vm_state active and task_state migrating.
Nov 28 17:43:36 compute-0 nova_compute[187223]: 2025-11-28 17:43:36.986 187227 DEBUG nova.compute.manager [req-76197762-1028-4dfd-a761-533faab04acc req-74269f7a-9ff2-4576-b887-63333f7dfed0 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Received event network-vif-plugged-28f9e34a-6f32-471b-94c6-433b669475ca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:43:36 compute-0 nova_compute[187223]: 2025-11-28 17:43:36.986 187227 DEBUG oslo_concurrency.lockutils [req-76197762-1028-4dfd-a761-533faab04acc req-74269f7a-9ff2-4576-b887-63333f7dfed0 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "34d2d5a1-837e-49dc-a047-618a9ed35dd1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:43:36 compute-0 nova_compute[187223]: 2025-11-28 17:43:36.986 187227 DEBUG oslo_concurrency.lockutils [req-76197762-1028-4dfd-a761-533faab04acc req-74269f7a-9ff2-4576-b887-63333f7dfed0 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "34d2d5a1-837e-49dc-a047-618a9ed35dd1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:43:36 compute-0 nova_compute[187223]: 2025-11-28 17:43:36.987 187227 DEBUG oslo_concurrency.lockutils [req-76197762-1028-4dfd-a761-533faab04acc req-74269f7a-9ff2-4576-b887-63333f7dfed0 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "34d2d5a1-837e-49dc-a047-618a9ed35dd1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:43:36 compute-0 nova_compute[187223]: 2025-11-28 17:43:36.987 187227 DEBUG nova.compute.manager [req-76197762-1028-4dfd-a761-533faab04acc req-74269f7a-9ff2-4576-b887-63333f7dfed0 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] No waiting events found dispatching network-vif-plugged-28f9e34a-6f32-471b-94c6-433b669475ca pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:43:36 compute-0 nova_compute[187223]: 2025-11-28 17:43:36.987 187227 WARNING nova.compute.manager [req-76197762-1028-4dfd-a761-533faab04acc req-74269f7a-9ff2-4576-b887-63333f7dfed0 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Received unexpected event network-vif-plugged-28f9e34a-6f32-471b-94c6-433b669475ca for instance with vm_state active and task_state migrating.
Nov 28 17:43:36 compute-0 nova_compute[187223]: 2025-11-28 17:43:36.987 187227 DEBUG nova.compute.manager [req-76197762-1028-4dfd-a761-533faab04acc req-74269f7a-9ff2-4576-b887-63333f7dfed0 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Received event network-vif-plugged-28f9e34a-6f32-471b-94c6-433b669475ca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:43:36 compute-0 nova_compute[187223]: 2025-11-28 17:43:36.987 187227 DEBUG oslo_concurrency.lockutils [req-76197762-1028-4dfd-a761-533faab04acc req-74269f7a-9ff2-4576-b887-63333f7dfed0 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "34d2d5a1-837e-49dc-a047-618a9ed35dd1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:43:36 compute-0 nova_compute[187223]: 2025-11-28 17:43:36.988 187227 DEBUG oslo_concurrency.lockutils [req-76197762-1028-4dfd-a761-533faab04acc req-74269f7a-9ff2-4576-b887-63333f7dfed0 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "34d2d5a1-837e-49dc-a047-618a9ed35dd1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:43:36 compute-0 nova_compute[187223]: 2025-11-28 17:43:36.988 187227 DEBUG oslo_concurrency.lockutils [req-76197762-1028-4dfd-a761-533faab04acc req-74269f7a-9ff2-4576-b887-63333f7dfed0 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "34d2d5a1-837e-49dc-a047-618a9ed35dd1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:43:36 compute-0 nova_compute[187223]: 2025-11-28 17:43:36.988 187227 DEBUG nova.compute.manager [req-76197762-1028-4dfd-a761-533faab04acc req-74269f7a-9ff2-4576-b887-63333f7dfed0 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] No waiting events found dispatching network-vif-plugged-28f9e34a-6f32-471b-94c6-433b669475ca pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:43:36 compute-0 nova_compute[187223]: 2025-11-28 17:43:36.988 187227 WARNING nova.compute.manager [req-76197762-1028-4dfd-a761-533faab04acc req-74269f7a-9ff2-4576-b887-63333f7dfed0 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Received unexpected event network-vif-plugged-28f9e34a-6f32-471b-94c6-433b669475ca for instance with vm_state active and task_state migrating.
Nov 28 17:43:37 compute-0 nova_compute[187223]: 2025-11-28 17:43:37.112 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:43:38 compute-0 nova_compute[187223]: 2025-11-28 17:43:38.685 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:43:38 compute-0 nova_compute[187223]: 2025-11-28 17:43:38.734 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:43:39 compute-0 podman[213221]: 2025-11-28 17:43:39.235246835 +0000 UTC m=+0.079623224 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 17:43:39 compute-0 nova_compute[187223]: 2025-11-28 17:43:39.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:43:39 compute-0 nova_compute[187223]: 2025-11-28 17:43:39.685 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 17:43:39 compute-0 nova_compute[187223]: 2025-11-28 17:43:39.685 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 17:43:39 compute-0 nova_compute[187223]: 2025-11-28 17:43:39.704 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "refresh_cache-34d2d5a1-837e-49dc-a047-618a9ed35dd1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 17:43:39 compute-0 nova_compute[187223]: 2025-11-28 17:43:39.704 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquired lock "refresh_cache-34d2d5a1-837e-49dc-a047-618a9ed35dd1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 17:43:39 compute-0 nova_compute[187223]: 2025-11-28 17:43:39.704 187227 DEBUG nova.network.neutron [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 28 17:43:39 compute-0 nova_compute[187223]: 2025-11-28 17:43:39.704 187227 DEBUG nova.objects.instance [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 34d2d5a1-837e-49dc-a047-618a9ed35dd1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:43:39 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Nov 28 17:43:39 compute-0 systemd[213073]: Activating special unit Exit the Session...
Nov 28 17:43:39 compute-0 systemd[213073]: Stopped target Main User Target.
Nov 28 17:43:39 compute-0 systemd[213073]: Stopped target Basic System.
Nov 28 17:43:39 compute-0 systemd[213073]: Stopped target Paths.
Nov 28 17:43:39 compute-0 systemd[213073]: Stopped target Sockets.
Nov 28 17:43:39 compute-0 systemd[213073]: Stopped target Timers.
Nov 28 17:43:39 compute-0 systemd[213073]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 28 17:43:39 compute-0 systemd[213073]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 28 17:43:39 compute-0 systemd[213073]: Closed D-Bus User Message Bus Socket.
Nov 28 17:43:39 compute-0 systemd[213073]: Stopped Create User's Volatile Files and Directories.
Nov 28 17:43:39 compute-0 systemd[213073]: Removed slice User Application Slice.
Nov 28 17:43:39 compute-0 systemd[213073]: Reached target Shutdown.
Nov 28 17:43:39 compute-0 systemd[213073]: Finished Exit the Session.
Nov 28 17:43:39 compute-0 systemd[213073]: Reached target Exit the Session.
Nov 28 17:43:39 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Nov 28 17:43:39 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Nov 28 17:43:39 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 28 17:43:39 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 28 17:43:39 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 28 17:43:39 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 28 17:43:39 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Nov 28 17:43:40 compute-0 nova_compute[187223]: 2025-11-28 17:43:40.635 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:43:40 compute-0 sshd-session[213241]: Invalid user validator from 193.32.162.146 port 33324
Nov 28 17:43:41 compute-0 sshd-session[213241]: Connection closed by invalid user validator 193.32.162.146 port 33324 [preauth]
Nov 28 17:43:42 compute-0 nova_compute[187223]: 2025-11-28 17:43:42.804 187227 DEBUG nova.network.neutron [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Updating instance_info_cache with network_info: [{"id": "28f9e34a-6f32-471b-94c6-433b669475ca", "address": "fa:16:3e:b7:22:8e", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap28f9e34a-6f", "ovs_interfaceid": "28f9e34a-6f32-471b-94c6-433b669475ca", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:43:42 compute-0 nova_compute[187223]: 2025-11-28 17:43:42.832 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Releasing lock "refresh_cache-34d2d5a1-837e-49dc-a047-618a9ed35dd1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 17:43:42 compute-0 nova_compute[187223]: 2025-11-28 17:43:42.832 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 28 17:43:42 compute-0 nova_compute[187223]: 2025-11-28 17:43:42.833 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:43:42 compute-0 nova_compute[187223]: 2025-11-28 17:43:42.833 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:43:43 compute-0 nova_compute[187223]: 2025-11-28 17:43:43.383 187227 DEBUG oslo_concurrency.lockutils [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "34d2d5a1-837e-49dc-a047-618a9ed35dd1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:43:43 compute-0 nova_compute[187223]: 2025-11-28 17:43:43.384 187227 DEBUG oslo_concurrency.lockutils [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "34d2d5a1-837e-49dc-a047-618a9ed35dd1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:43:43 compute-0 nova_compute[187223]: 2025-11-28 17:43:43.384 187227 DEBUG oslo_concurrency.lockutils [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "34d2d5a1-837e-49dc-a047-618a9ed35dd1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:43:43 compute-0 nova_compute[187223]: 2025-11-28 17:43:43.409 187227 DEBUG oslo_concurrency.lockutils [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:43:43 compute-0 nova_compute[187223]: 2025-11-28 17:43:43.410 187227 DEBUG oslo_concurrency.lockutils [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:43:43 compute-0 nova_compute[187223]: 2025-11-28 17:43:43.410 187227 DEBUG oslo_concurrency.lockutils [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:43:43 compute-0 nova_compute[187223]: 2025-11-28 17:43:43.410 187227 DEBUG nova.compute.resource_tracker [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 17:43:43 compute-0 nova_compute[187223]: 2025-11-28 17:43:43.566 187227 WARNING nova.virt.libvirt.driver [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 17:43:43 compute-0 nova_compute[187223]: 2025-11-28 17:43:43.567 187227 DEBUG nova.compute.resource_tracker [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5869MB free_disk=73.34130096435547GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 17:43:43 compute-0 nova_compute[187223]: 2025-11-28 17:43:43.568 187227 DEBUG oslo_concurrency.lockutils [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:43:43 compute-0 nova_compute[187223]: 2025-11-28 17:43:43.568 187227 DEBUG oslo_concurrency.lockutils [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:43:43 compute-0 nova_compute[187223]: 2025-11-28 17:43:43.603 187227 DEBUG nova.compute.resource_tracker [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Migration for instance 34d2d5a1-837e-49dc-a047-618a9ed35dd1 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Nov 28 17:43:43 compute-0 nova_compute[187223]: 2025-11-28 17:43:43.623 187227 DEBUG nova.compute.resource_tracker [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Nov 28 17:43:43 compute-0 nova_compute[187223]: 2025-11-28 17:43:43.664 187227 DEBUG nova.compute.resource_tracker [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Migration ded7c2fa-0f3f-4615-9d88-b64ba53e90b7 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Nov 28 17:43:43 compute-0 nova_compute[187223]: 2025-11-28 17:43:43.664 187227 DEBUG nova.compute.resource_tracker [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 17:43:43 compute-0 nova_compute[187223]: 2025-11-28 17:43:43.665 187227 DEBUG nova.compute.resource_tracker [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 17:43:43 compute-0 nova_compute[187223]: 2025-11-28 17:43:43.710 187227 DEBUG nova.compute.provider_tree [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 17:43:43 compute-0 nova_compute[187223]: 2025-11-28 17:43:43.724 187227 DEBUG nova.scheduler.client.report [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 17:43:43 compute-0 nova_compute[187223]: 2025-11-28 17:43:43.735 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:43:43 compute-0 nova_compute[187223]: 2025-11-28 17:43:43.745 187227 DEBUG nova.compute.resource_tracker [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 17:43:43 compute-0 nova_compute[187223]: 2025-11-28 17:43:43.745 187227 DEBUG oslo_concurrency.lockutils [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.178s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:43:43 compute-0 nova_compute[187223]: 2025-11-28 17:43:43.754 187227 INFO nova.compute.manager [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Nov 28 17:43:43 compute-0 nova_compute[187223]: 2025-11-28 17:43:43.861 187227 INFO nova.scheduler.client.report [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Deleted allocation for migration ded7c2fa-0f3f-4615-9d88-b64ba53e90b7
Nov 28 17:43:43 compute-0 nova_compute[187223]: 2025-11-28 17:43:43.862 187227 DEBUG nova.virt.libvirt.driver [None req-8d1d964b-8af6-4f6d-87b2-e4c87d2ca811 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Nov 28 17:43:45 compute-0 podman[213245]: 2025-11-28 17:43:45.235962174 +0000 UTC m=+0.089858675 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125)
Nov 28 17:43:45 compute-0 podman[213244]: 2025-11-28 17:43:45.235928553 +0000 UTC m=+0.090002058 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 28 17:43:45 compute-0 nova_compute[187223]: 2025-11-28 17:43:45.639 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:43:45 compute-0 nova_compute[187223]: 2025-11-28 17:43:45.827 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:43:48 compute-0 podman[213289]: 2025-11-28 17:43:48.197040451 +0000 UTC m=+0.057371574 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, version=9.6, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vcs-type=git, release=1755695350, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc.)
Nov 28 17:43:48 compute-0 nova_compute[187223]: 2025-11-28 17:43:48.738 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:43:49 compute-0 nova_compute[187223]: 2025-11-28 17:43:49.947 187227 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764351814.945238, 34d2d5a1-837e-49dc-a047-618a9ed35dd1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:43:49 compute-0 nova_compute[187223]: 2025-11-28 17:43:49.947 187227 INFO nova.compute.manager [-] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] VM Stopped (Lifecycle Event)
Nov 28 17:43:49 compute-0 nova_compute[187223]: 2025-11-28 17:43:49.980 187227 DEBUG nova.compute.manager [None req-104e9ad1-c23e-4e64-acf5-fa5490c09e2a - - - - - -] [instance: 34d2d5a1-837e-49dc-a047-618a9ed35dd1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:43:50 compute-0 nova_compute[187223]: 2025-11-28 17:43:50.642 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:43:53 compute-0 nova_compute[187223]: 2025-11-28 17:43:53.740 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:43:55 compute-0 nova_compute[187223]: 2025-11-28 17:43:55.644 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:43:58 compute-0 nova_compute[187223]: 2025-11-28 17:43:58.742 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:43:59 compute-0 podman[197556]: time="2025-11-28T17:43:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:43:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:43:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 28 17:43:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:43:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2594 "" "Go-http-client/1.1"
Nov 28 17:44:00 compute-0 nova_compute[187223]: 2025-11-28 17:44:00.646 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:44:01 compute-0 openstack_network_exporter[199717]: ERROR   17:44:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:44:01 compute-0 openstack_network_exporter[199717]: ERROR   17:44:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:44:01 compute-0 openstack_network_exporter[199717]: ERROR   17:44:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:44:01 compute-0 openstack_network_exporter[199717]: ERROR   17:44:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:44:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:44:01 compute-0 openstack_network_exporter[199717]: ERROR   17:44:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:44:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:44:03 compute-0 nova_compute[187223]: 2025-11-28 17:44:03.744 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:44:04 compute-0 podman[213311]: 2025-11-28 17:44:04.24120183 +0000 UTC m=+0.104922748 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 17:44:05 compute-0 nova_compute[187223]: 2025-11-28 17:44:05.649 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:44:07 compute-0 nova_compute[187223]: 2025-11-28 17:44:07.296 187227 DEBUG nova.compute.manager [None req-c841d8b5-dff3-43a4-8168-ab2d7c8d1329 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Removing trait COMPUTE_STATUS_DISABLED from compute node resource provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 in placement. update_compute_provider_status /usr/lib/python3.9/site-packages/nova/compute/manager.py:606
Nov 28 17:44:07 compute-0 nova_compute[187223]: 2025-11-28 17:44:07.369 187227 DEBUG nova.compute.provider_tree [None req-c841d8b5-dff3-43a4-8168-ab2d7c8d1329 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Updating resource provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 generation from 13 to 16 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Nov 28 17:44:08 compute-0 nova_compute[187223]: 2025-11-28 17:44:08.747 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:44:10 compute-0 podman[213335]: 2025-11-28 17:44:10.220599484 +0000 UTC m=+0.084495207 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 28 17:44:10 compute-0 nova_compute[187223]: 2025-11-28 17:44:10.651 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:44:13 compute-0 nova_compute[187223]: 2025-11-28 17:44:13.749 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:44:15 compute-0 nova_compute[187223]: 2025-11-28 17:44:15.653 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:44:16 compute-0 podman[213353]: 2025-11-28 17:44:16.198137625 +0000 UTC m=+0.058370762 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=multipathd)
Nov 28 17:44:16 compute-0 podman[213354]: 2025-11-28 17:44:16.244709662 +0000 UTC m=+0.090774900 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 28 17:44:18 compute-0 nova_compute[187223]: 2025-11-28 17:44:18.762 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:44:19 compute-0 podman[213398]: 2025-11-28 17:44:19.204970161 +0000 UTC m=+0.064585962 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, version=9.6, build-date=2025-08-20T13:12:41)
Nov 28 17:44:19 compute-0 ovn_controller[95574]: 2025-11-28T17:44:19Z|00112|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Nov 28 17:44:20 compute-0 nova_compute[187223]: 2025-11-28 17:44:20.655 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:44:20 compute-0 nova_compute[187223]: 2025-11-28 17:44:20.720 187227 DEBUG oslo_concurrency.lockutils [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Acquiring lock "7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:44:20 compute-0 nova_compute[187223]: 2025-11-28 17:44:20.721 187227 DEBUG oslo_concurrency.lockutils [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:44:20 compute-0 nova_compute[187223]: 2025-11-28 17:44:20.735 187227 DEBUG nova.compute.manager [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 28 17:44:20 compute-0 nova_compute[187223]: 2025-11-28 17:44:20.815 187227 DEBUG oslo_concurrency.lockutils [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:44:20 compute-0 nova_compute[187223]: 2025-11-28 17:44:20.816 187227 DEBUG oslo_concurrency.lockutils [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:44:20 compute-0 nova_compute[187223]: 2025-11-28 17:44:20.825 187227 DEBUG nova.virt.hardware [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 28 17:44:20 compute-0 nova_compute[187223]: 2025-11-28 17:44:20.825 187227 INFO nova.compute.claims [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Claim successful on node compute-0.ctlplane.example.com
Nov 28 17:44:20 compute-0 nova_compute[187223]: 2025-11-28 17:44:20.926 187227 DEBUG nova.compute.provider_tree [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 17:44:20 compute-0 nova_compute[187223]: 2025-11-28 17:44:20.939 187227 DEBUG nova.scheduler.client.report [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 17:44:20 compute-0 nova_compute[187223]: 2025-11-28 17:44:20.962 187227 DEBUG oslo_concurrency.lockutils [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.146s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:44:20 compute-0 nova_compute[187223]: 2025-11-28 17:44:20.963 187227 DEBUG nova.compute.manager [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 28 17:44:21 compute-0 nova_compute[187223]: 2025-11-28 17:44:21.011 187227 DEBUG nova.compute.manager [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 28 17:44:21 compute-0 nova_compute[187223]: 2025-11-28 17:44:21.012 187227 DEBUG nova.network.neutron [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 28 17:44:21 compute-0 nova_compute[187223]: 2025-11-28 17:44:21.033 187227 INFO nova.virt.libvirt.driver [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 28 17:44:21 compute-0 nova_compute[187223]: 2025-11-28 17:44:21.055 187227 DEBUG nova.compute.manager [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 28 17:44:21 compute-0 nova_compute[187223]: 2025-11-28 17:44:21.213 187227 DEBUG nova.compute.manager [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 28 17:44:21 compute-0 nova_compute[187223]: 2025-11-28 17:44:21.215 187227 DEBUG nova.virt.libvirt.driver [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 28 17:44:21 compute-0 nova_compute[187223]: 2025-11-28 17:44:21.215 187227 INFO nova.virt.libvirt.driver [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Creating image(s)
Nov 28 17:44:21 compute-0 nova_compute[187223]: 2025-11-28 17:44:21.216 187227 DEBUG oslo_concurrency.lockutils [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Acquiring lock "/var/lib/nova/instances/7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:44:21 compute-0 nova_compute[187223]: 2025-11-28 17:44:21.216 187227 DEBUG oslo_concurrency.lockutils [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "/var/lib/nova/instances/7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:44:21 compute-0 nova_compute[187223]: 2025-11-28 17:44:21.217 187227 DEBUG oslo_concurrency.lockutils [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "/var/lib/nova/instances/7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:44:21 compute-0 nova_compute[187223]: 2025-11-28 17:44:21.229 187227 DEBUG oslo_concurrency.processutils [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:44:21 compute-0 nova_compute[187223]: 2025-11-28 17:44:21.286 187227 DEBUG oslo_concurrency.processutils [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:44:21 compute-0 nova_compute[187223]: 2025-11-28 17:44:21.287 187227 DEBUG oslo_concurrency.lockutils [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Acquiring lock "bd6565e0cc32420aac21dd3de8842bc53bb13eba" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:44:21 compute-0 nova_compute[187223]: 2025-11-28 17:44:21.288 187227 DEBUG oslo_concurrency.lockutils [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "bd6565e0cc32420aac21dd3de8842bc53bb13eba" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:44:21 compute-0 nova_compute[187223]: 2025-11-28 17:44:21.306 187227 DEBUG oslo_concurrency.processutils [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:44:21 compute-0 nova_compute[187223]: 2025-11-28 17:44:21.361 187227 DEBUG nova.policy [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '40bca16232f3471c8094a414f8874e9a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f987f40adf1f46018ab0ca81b8d954f6', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 28 17:44:21 compute-0 nova_compute[187223]: 2025-11-28 17:44:21.377 187227 DEBUG oslo_concurrency.processutils [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:44:21 compute-0 nova_compute[187223]: 2025-11-28 17:44:21.378 187227 DEBUG oslo_concurrency.processutils [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba,backing_fmt=raw /var/lib/nova/instances/7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:44:21 compute-0 nova_compute[187223]: 2025-11-28 17:44:21.604 187227 DEBUG oslo_concurrency.processutils [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba,backing_fmt=raw /var/lib/nova/instances/7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e/disk 1073741824" returned: 0 in 0.226s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:44:21 compute-0 nova_compute[187223]: 2025-11-28 17:44:21.605 187227 DEBUG oslo_concurrency.lockutils [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "bd6565e0cc32420aac21dd3de8842bc53bb13eba" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.317s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:44:21 compute-0 nova_compute[187223]: 2025-11-28 17:44:21.606 187227 DEBUG oslo_concurrency.processutils [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:44:21 compute-0 nova_compute[187223]: 2025-11-28 17:44:21.691 187227 DEBUG oslo_concurrency.processutils [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:44:21 compute-0 nova_compute[187223]: 2025-11-28 17:44:21.692 187227 DEBUG nova.virt.disk.api [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Checking if we can resize image /var/lib/nova/instances/7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 28 17:44:21 compute-0 nova_compute[187223]: 2025-11-28 17:44:21.692 187227 DEBUG oslo_concurrency.processutils [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:44:21 compute-0 nova_compute[187223]: 2025-11-28 17:44:21.773 187227 DEBUG oslo_concurrency.processutils [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e/disk --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:44:21 compute-0 nova_compute[187223]: 2025-11-28 17:44:21.775 187227 DEBUG nova.virt.disk.api [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Cannot resize image /var/lib/nova/instances/7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 28 17:44:21 compute-0 nova_compute[187223]: 2025-11-28 17:44:21.775 187227 DEBUG nova.objects.instance [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lazy-loading 'migration_context' on Instance uuid 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:44:21 compute-0 nova_compute[187223]: 2025-11-28 17:44:21.794 187227 DEBUG nova.virt.libvirt.driver [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 28 17:44:21 compute-0 nova_compute[187223]: 2025-11-28 17:44:21.795 187227 DEBUG nova.virt.libvirt.driver [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Ensure instance console log exists: /var/lib/nova/instances/7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 28 17:44:21 compute-0 nova_compute[187223]: 2025-11-28 17:44:21.795 187227 DEBUG oslo_concurrency.lockutils [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:44:21 compute-0 nova_compute[187223]: 2025-11-28 17:44:21.795 187227 DEBUG oslo_concurrency.lockutils [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:44:21 compute-0 nova_compute[187223]: 2025-11-28 17:44:21.796 187227 DEBUG oslo_concurrency.lockutils [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:44:21 compute-0 nova_compute[187223]: 2025-11-28 17:44:21.983 187227 DEBUG nova.network.neutron [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Successfully created port: 1cc0848e-9ace-430b-998d-ccc5976c6756 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 28 17:44:22 compute-0 nova_compute[187223]: 2025-11-28 17:44:22.977 187227 DEBUG nova.network.neutron [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Successfully updated port: 1cc0848e-9ace-430b-998d-ccc5976c6756 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 28 17:44:22 compute-0 nova_compute[187223]: 2025-11-28 17:44:22.996 187227 DEBUG oslo_concurrency.lockutils [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Acquiring lock "refresh_cache-7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 17:44:22 compute-0 nova_compute[187223]: 2025-11-28 17:44:22.997 187227 DEBUG oslo_concurrency.lockutils [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Acquired lock "refresh_cache-7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 17:44:22 compute-0 nova_compute[187223]: 2025-11-28 17:44:22.997 187227 DEBUG nova.network.neutron [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 28 17:44:23 compute-0 nova_compute[187223]: 2025-11-28 17:44:23.122 187227 DEBUG nova.compute.manager [req-322a46e8-4570-49d0-80da-48189ce9b39b req-069b619d-437a-44c8-9ae8-6ee784e7467d 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Received event network-changed-1cc0848e-9ace-430b-998d-ccc5976c6756 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:44:23 compute-0 nova_compute[187223]: 2025-11-28 17:44:23.122 187227 DEBUG nova.compute.manager [req-322a46e8-4570-49d0-80da-48189ce9b39b req-069b619d-437a-44c8-9ae8-6ee784e7467d 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Refreshing instance network info cache due to event network-changed-1cc0848e-9ace-430b-998d-ccc5976c6756. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 28 17:44:23 compute-0 nova_compute[187223]: 2025-11-28 17:44:23.123 187227 DEBUG oslo_concurrency.lockutils [req-322a46e8-4570-49d0-80da-48189ce9b39b req-069b619d-437a-44c8-9ae8-6ee784e7467d 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "refresh_cache-7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 17:44:23 compute-0 nova_compute[187223]: 2025-11-28 17:44:23.332 187227 DEBUG nova.network.neutron [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 28 17:44:23 compute-0 nova_compute[187223]: 2025-11-28 17:44:23.765 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:44:24 compute-0 nova_compute[187223]: 2025-11-28 17:44:24.007 187227 DEBUG nova.network.neutron [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Updating instance_info_cache with network_info: [{"id": "1cc0848e-9ace-430b-998d-ccc5976c6756", "address": "fa:16:3e:57:a7:6c", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1cc0848e-9a", "ovs_interfaceid": "1cc0848e-9ace-430b-998d-ccc5976c6756", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:44:24 compute-0 nova_compute[187223]: 2025-11-28 17:44:24.054 187227 DEBUG oslo_concurrency.lockutils [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Releasing lock "refresh_cache-7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 17:44:24 compute-0 nova_compute[187223]: 2025-11-28 17:44:24.055 187227 DEBUG nova.compute.manager [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Instance network_info: |[{"id": "1cc0848e-9ace-430b-998d-ccc5976c6756", "address": "fa:16:3e:57:a7:6c", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1cc0848e-9a", "ovs_interfaceid": "1cc0848e-9ace-430b-998d-ccc5976c6756", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 28 17:44:24 compute-0 nova_compute[187223]: 2025-11-28 17:44:24.056 187227 DEBUG oslo_concurrency.lockutils [req-322a46e8-4570-49d0-80da-48189ce9b39b req-069b619d-437a-44c8-9ae8-6ee784e7467d 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquired lock "refresh_cache-7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 17:44:24 compute-0 nova_compute[187223]: 2025-11-28 17:44:24.056 187227 DEBUG nova.network.neutron [req-322a46e8-4570-49d0-80da-48189ce9b39b req-069b619d-437a-44c8-9ae8-6ee784e7467d 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Refreshing network info cache for port 1cc0848e-9ace-430b-998d-ccc5976c6756 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 28 17:44:24 compute-0 nova_compute[187223]: 2025-11-28 17:44:24.059 187227 DEBUG nova.virt.libvirt.driver [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Start _get_guest_xml network_info=[{"id": "1cc0848e-9ace-430b-998d-ccc5976c6756", "address": "fa:16:3e:57:a7:6c", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1cc0848e-9a", "ovs_interfaceid": "1cc0848e-9ace-430b-998d-ccc5976c6756", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T17:27:07Z,direct_url=<?>,disk_format='qcow2',id=e66bcfff-a835-4b6a-9892-490d158c356a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ca3b936304c947f98c618f4bf25dee0e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-28T17:27:08Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_format': None, 'device_name': '/dev/vda', 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e66bcfff-a835-4b6a-9892-490d158c356a'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 28 17:44:24 compute-0 nova_compute[187223]: 2025-11-28 17:44:24.063 187227 WARNING nova.virt.libvirt.driver [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 17:44:24 compute-0 nova_compute[187223]: 2025-11-28 17:44:24.069 187227 DEBUG nova.virt.libvirt.host [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 28 17:44:24 compute-0 nova_compute[187223]: 2025-11-28 17:44:24.069 187227 DEBUG nova.virt.libvirt.host [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 28 17:44:24 compute-0 nova_compute[187223]: 2025-11-28 17:44:24.073 187227 DEBUG nova.virt.libvirt.host [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 28 17:44:24 compute-0 nova_compute[187223]: 2025-11-28 17:44:24.074 187227 DEBUG nova.virt.libvirt.host [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 28 17:44:24 compute-0 nova_compute[187223]: 2025-11-28 17:44:24.075 187227 DEBUG nova.virt.libvirt.driver [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 28 17:44:24 compute-0 nova_compute[187223]: 2025-11-28 17:44:24.076 187227 DEBUG nova.virt.hardware [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-28T17:27:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6f44bded-bdbe-4623-9c87-afc5919e8381',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T17:27:07Z,direct_url=<?>,disk_format='qcow2',id=e66bcfff-a835-4b6a-9892-490d158c356a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ca3b936304c947f98c618f4bf25dee0e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-28T17:27:08Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 28 17:44:24 compute-0 nova_compute[187223]: 2025-11-28 17:44:24.076 187227 DEBUG nova.virt.hardware [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 28 17:44:24 compute-0 nova_compute[187223]: 2025-11-28 17:44:24.076 187227 DEBUG nova.virt.hardware [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 28 17:44:24 compute-0 nova_compute[187223]: 2025-11-28 17:44:24.076 187227 DEBUG nova.virt.hardware [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 28 17:44:24 compute-0 nova_compute[187223]: 2025-11-28 17:44:24.077 187227 DEBUG nova.virt.hardware [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 28 17:44:24 compute-0 nova_compute[187223]: 2025-11-28 17:44:24.077 187227 DEBUG nova.virt.hardware [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 28 17:44:24 compute-0 nova_compute[187223]: 2025-11-28 17:44:24.077 187227 DEBUG nova.virt.hardware [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 28 17:44:24 compute-0 nova_compute[187223]: 2025-11-28 17:44:24.077 187227 DEBUG nova.virt.hardware [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 28 17:44:24 compute-0 nova_compute[187223]: 2025-11-28 17:44:24.077 187227 DEBUG nova.virt.hardware [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 28 17:44:24 compute-0 nova_compute[187223]: 2025-11-28 17:44:24.078 187227 DEBUG nova.virt.hardware [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 28 17:44:24 compute-0 nova_compute[187223]: 2025-11-28 17:44:24.078 187227 DEBUG nova.virt.hardware [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 28 17:44:24 compute-0 nova_compute[187223]: 2025-11-28 17:44:24.082 187227 DEBUG nova.virt.libvirt.vif [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T17:44:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-352843531',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-352843531',id=13,image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f987f40adf1f46018ab0ca81b8d954f6',ramdisk_id='',reservation_id='r-zjifhlpx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-384316604',owner_user_name='tempest-TestExecuteStrategies-384316604-project-member'},tags=TagList,task_sta
te='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T17:44:21Z,user_data=None,user_id='40bca16232f3471c8094a414f8874e9a',uuid=7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1cc0848e-9ace-430b-998d-ccc5976c6756", "address": "fa:16:3e:57:a7:6c", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1cc0848e-9a", "ovs_interfaceid": "1cc0848e-9ace-430b-998d-ccc5976c6756", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 28 17:44:24 compute-0 nova_compute[187223]: 2025-11-28 17:44:24.082 187227 DEBUG nova.network.os_vif_util [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Converting VIF {"id": "1cc0848e-9ace-430b-998d-ccc5976c6756", "address": "fa:16:3e:57:a7:6c", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1cc0848e-9a", "ovs_interfaceid": "1cc0848e-9ace-430b-998d-ccc5976c6756", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 17:44:24 compute-0 nova_compute[187223]: 2025-11-28 17:44:24.083 187227 DEBUG nova.network.os_vif_util [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:57:a7:6c,bridge_name='br-int',has_traffic_filtering=True,id=1cc0848e-9ace-430b-998d-ccc5976c6756,network=Network(7710a7d0-31b3-4473-89c4-40533fdd6e7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1cc0848e-9a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 17:44:24 compute-0 nova_compute[187223]: 2025-11-28 17:44:24.083 187227 DEBUG nova.objects.instance [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:44:24 compute-0 nova_compute[187223]: 2025-11-28 17:44:24.266 187227 DEBUG nova.virt.libvirt.driver [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] End _get_guest_xml xml=<domain type="kvm">
Nov 28 17:44:24 compute-0 nova_compute[187223]:   <uuid>7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e</uuid>
Nov 28 17:44:24 compute-0 nova_compute[187223]:   <name>instance-0000000d</name>
Nov 28 17:44:24 compute-0 nova_compute[187223]:   <memory>131072</memory>
Nov 28 17:44:24 compute-0 nova_compute[187223]:   <vcpu>1</vcpu>
Nov 28 17:44:24 compute-0 nova_compute[187223]:   <metadata>
Nov 28 17:44:24 compute-0 nova_compute[187223]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 28 17:44:24 compute-0 nova_compute[187223]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 28 17:44:24 compute-0 nova_compute[187223]:       <nova:name>tempest-TestExecuteStrategies-server-352843531</nova:name>
Nov 28 17:44:24 compute-0 nova_compute[187223]:       <nova:creationTime>2025-11-28 17:44:24</nova:creationTime>
Nov 28 17:44:24 compute-0 nova_compute[187223]:       <nova:flavor name="m1.nano">
Nov 28 17:44:24 compute-0 nova_compute[187223]:         <nova:memory>128</nova:memory>
Nov 28 17:44:24 compute-0 nova_compute[187223]:         <nova:disk>1</nova:disk>
Nov 28 17:44:24 compute-0 nova_compute[187223]:         <nova:swap>0</nova:swap>
Nov 28 17:44:24 compute-0 nova_compute[187223]:         <nova:ephemeral>0</nova:ephemeral>
Nov 28 17:44:24 compute-0 nova_compute[187223]:         <nova:vcpus>1</nova:vcpus>
Nov 28 17:44:24 compute-0 nova_compute[187223]:       </nova:flavor>
Nov 28 17:44:24 compute-0 nova_compute[187223]:       <nova:owner>
Nov 28 17:44:24 compute-0 nova_compute[187223]:         <nova:user uuid="40bca16232f3471c8094a414f8874e9a">tempest-TestExecuteStrategies-384316604-project-member</nova:user>
Nov 28 17:44:24 compute-0 nova_compute[187223]:         <nova:project uuid="f987f40adf1f46018ab0ca81b8d954f6">tempest-TestExecuteStrategies-384316604</nova:project>
Nov 28 17:44:24 compute-0 nova_compute[187223]:       </nova:owner>
Nov 28 17:44:24 compute-0 nova_compute[187223]:       <nova:root type="image" uuid="e66bcfff-a835-4b6a-9892-490d158c356a"/>
Nov 28 17:44:24 compute-0 nova_compute[187223]:       <nova:ports>
Nov 28 17:44:24 compute-0 nova_compute[187223]:         <nova:port uuid="1cc0848e-9ace-430b-998d-ccc5976c6756">
Nov 28 17:44:24 compute-0 nova_compute[187223]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Nov 28 17:44:24 compute-0 nova_compute[187223]:         </nova:port>
Nov 28 17:44:24 compute-0 nova_compute[187223]:       </nova:ports>
Nov 28 17:44:24 compute-0 nova_compute[187223]:     </nova:instance>
Nov 28 17:44:24 compute-0 nova_compute[187223]:   </metadata>
Nov 28 17:44:24 compute-0 nova_compute[187223]:   <sysinfo type="smbios">
Nov 28 17:44:24 compute-0 nova_compute[187223]:     <system>
Nov 28 17:44:24 compute-0 nova_compute[187223]:       <entry name="manufacturer">RDO</entry>
Nov 28 17:44:24 compute-0 nova_compute[187223]:       <entry name="product">OpenStack Compute</entry>
Nov 28 17:44:24 compute-0 nova_compute[187223]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 28 17:44:24 compute-0 nova_compute[187223]:       <entry name="serial">7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e</entry>
Nov 28 17:44:24 compute-0 nova_compute[187223]:       <entry name="uuid">7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e</entry>
Nov 28 17:44:24 compute-0 nova_compute[187223]:       <entry name="family">Virtual Machine</entry>
Nov 28 17:44:24 compute-0 nova_compute[187223]:     </system>
Nov 28 17:44:24 compute-0 nova_compute[187223]:   </sysinfo>
Nov 28 17:44:24 compute-0 nova_compute[187223]:   <os>
Nov 28 17:44:24 compute-0 nova_compute[187223]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 28 17:44:24 compute-0 nova_compute[187223]:     <boot dev="hd"/>
Nov 28 17:44:24 compute-0 nova_compute[187223]:     <smbios mode="sysinfo"/>
Nov 28 17:44:24 compute-0 nova_compute[187223]:   </os>
Nov 28 17:44:24 compute-0 nova_compute[187223]:   <features>
Nov 28 17:44:24 compute-0 nova_compute[187223]:     <acpi/>
Nov 28 17:44:24 compute-0 nova_compute[187223]:     <apic/>
Nov 28 17:44:24 compute-0 nova_compute[187223]:     <vmcoreinfo/>
Nov 28 17:44:24 compute-0 nova_compute[187223]:   </features>
Nov 28 17:44:24 compute-0 nova_compute[187223]:   <clock offset="utc">
Nov 28 17:44:24 compute-0 nova_compute[187223]:     <timer name="pit" tickpolicy="delay"/>
Nov 28 17:44:24 compute-0 nova_compute[187223]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 28 17:44:24 compute-0 nova_compute[187223]:     <timer name="hpet" present="no"/>
Nov 28 17:44:24 compute-0 nova_compute[187223]:   </clock>
Nov 28 17:44:24 compute-0 nova_compute[187223]:   <cpu mode="custom" match="exact">
Nov 28 17:44:24 compute-0 nova_compute[187223]:     <model>Nehalem</model>
Nov 28 17:44:24 compute-0 nova_compute[187223]:     <topology sockets="1" cores="1" threads="1"/>
Nov 28 17:44:24 compute-0 nova_compute[187223]:   </cpu>
Nov 28 17:44:24 compute-0 nova_compute[187223]:   <devices>
Nov 28 17:44:24 compute-0 nova_compute[187223]:     <disk type="file" device="disk">
Nov 28 17:44:24 compute-0 nova_compute[187223]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 28 17:44:24 compute-0 nova_compute[187223]:       <source file="/var/lib/nova/instances/7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e/disk"/>
Nov 28 17:44:24 compute-0 nova_compute[187223]:       <target dev="vda" bus="virtio"/>
Nov 28 17:44:24 compute-0 nova_compute[187223]:     </disk>
Nov 28 17:44:24 compute-0 nova_compute[187223]:     <disk type="file" device="cdrom">
Nov 28 17:44:24 compute-0 nova_compute[187223]:       <driver name="qemu" type="raw" cache="none"/>
Nov 28 17:44:24 compute-0 nova_compute[187223]:       <source file="/var/lib/nova/instances/7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e/disk.config"/>
Nov 28 17:44:24 compute-0 nova_compute[187223]:       <target dev="sda" bus="sata"/>
Nov 28 17:44:24 compute-0 nova_compute[187223]:     </disk>
Nov 28 17:44:24 compute-0 nova_compute[187223]:     <interface type="ethernet">
Nov 28 17:44:24 compute-0 nova_compute[187223]:       <mac address="fa:16:3e:57:a7:6c"/>
Nov 28 17:44:24 compute-0 nova_compute[187223]:       <model type="virtio"/>
Nov 28 17:44:24 compute-0 nova_compute[187223]:       <driver name="vhost" rx_queue_size="512"/>
Nov 28 17:44:24 compute-0 nova_compute[187223]:       <mtu size="1442"/>
Nov 28 17:44:24 compute-0 nova_compute[187223]:       <target dev="tap1cc0848e-9a"/>
Nov 28 17:44:24 compute-0 nova_compute[187223]:     </interface>
Nov 28 17:44:24 compute-0 nova_compute[187223]:     <serial type="pty">
Nov 28 17:44:24 compute-0 nova_compute[187223]:       <log file="/var/lib/nova/instances/7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e/console.log" append="off"/>
Nov 28 17:44:24 compute-0 nova_compute[187223]:     </serial>
Nov 28 17:44:24 compute-0 nova_compute[187223]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 28 17:44:24 compute-0 nova_compute[187223]:     <video>
Nov 28 17:44:24 compute-0 nova_compute[187223]:       <model type="virtio"/>
Nov 28 17:44:24 compute-0 nova_compute[187223]:     </video>
Nov 28 17:44:24 compute-0 nova_compute[187223]:     <input type="tablet" bus="usb"/>
Nov 28 17:44:24 compute-0 nova_compute[187223]:     <rng model="virtio">
Nov 28 17:44:24 compute-0 nova_compute[187223]:       <backend model="random">/dev/urandom</backend>
Nov 28 17:44:24 compute-0 nova_compute[187223]:     </rng>
Nov 28 17:44:24 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root"/>
Nov 28 17:44:24 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:44:24 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:44:24 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:44:24 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:44:24 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:44:24 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:44:24 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:44:24 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:44:24 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:44:24 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:44:24 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:44:24 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:44:24 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:44:24 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:44:24 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:44:24 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:44:24 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:44:24 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:44:24 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:44:24 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:44:24 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:44:24 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:44:24 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:44:24 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:44:24 compute-0 nova_compute[187223]:     <controller type="usb" index="0"/>
Nov 28 17:44:24 compute-0 nova_compute[187223]:     <memballoon model="virtio">
Nov 28 17:44:24 compute-0 nova_compute[187223]:       <stats period="10"/>
Nov 28 17:44:24 compute-0 nova_compute[187223]:     </memballoon>
Nov 28 17:44:24 compute-0 nova_compute[187223]:   </devices>
Nov 28 17:44:24 compute-0 nova_compute[187223]: </domain>
Nov 28 17:44:24 compute-0 nova_compute[187223]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 28 17:44:24 compute-0 nova_compute[187223]: 2025-11-28 17:44:24.268 187227 DEBUG nova.compute.manager [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Preparing to wait for external event network-vif-plugged-1cc0848e-9ace-430b-998d-ccc5976c6756 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 28 17:44:24 compute-0 nova_compute[187223]: 2025-11-28 17:44:24.268 187227 DEBUG oslo_concurrency.lockutils [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Acquiring lock "7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:44:24 compute-0 nova_compute[187223]: 2025-11-28 17:44:24.269 187227 DEBUG oslo_concurrency.lockutils [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:44:24 compute-0 nova_compute[187223]: 2025-11-28 17:44:24.269 187227 DEBUG oslo_concurrency.lockutils [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:44:24 compute-0 nova_compute[187223]: 2025-11-28 17:44:24.270 187227 DEBUG nova.virt.libvirt.vif [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T17:44:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-352843531',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-352843531',id=13,image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f987f40adf1f46018ab0ca81b8d954f6',ramdisk_id='',reservation_id='r-zjifhlpx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-384316604',owner_user_name='tempest-TestExecuteStrategies-384316604-project-member'},tags=TagLis
t,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T17:44:21Z,user_data=None,user_id='40bca16232f3471c8094a414f8874e9a',uuid=7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1cc0848e-9ace-430b-998d-ccc5976c6756", "address": "fa:16:3e:57:a7:6c", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1cc0848e-9a", "ovs_interfaceid": "1cc0848e-9ace-430b-998d-ccc5976c6756", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 28 17:44:24 compute-0 nova_compute[187223]: 2025-11-28 17:44:24.270 187227 DEBUG nova.network.os_vif_util [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Converting VIF {"id": "1cc0848e-9ace-430b-998d-ccc5976c6756", "address": "fa:16:3e:57:a7:6c", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1cc0848e-9a", "ovs_interfaceid": "1cc0848e-9ace-430b-998d-ccc5976c6756", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 17:44:24 compute-0 nova_compute[187223]: 2025-11-28 17:44:24.271 187227 DEBUG nova.network.os_vif_util [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:57:a7:6c,bridge_name='br-int',has_traffic_filtering=True,id=1cc0848e-9ace-430b-998d-ccc5976c6756,network=Network(7710a7d0-31b3-4473-89c4-40533fdd6e7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1cc0848e-9a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 17:44:24 compute-0 nova_compute[187223]: 2025-11-28 17:44:24.272 187227 DEBUG os_vif [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:57:a7:6c,bridge_name='br-int',has_traffic_filtering=True,id=1cc0848e-9ace-430b-998d-ccc5976c6756,network=Network(7710a7d0-31b3-4473-89c4-40533fdd6e7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1cc0848e-9a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 28 17:44:24 compute-0 nova_compute[187223]: 2025-11-28 17:44:24.272 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:44:24 compute-0 nova_compute[187223]: 2025-11-28 17:44:24.273 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:44:24 compute-0 nova_compute[187223]: 2025-11-28 17:44:24.274 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 17:44:24 compute-0 nova_compute[187223]: 2025-11-28 17:44:24.278 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:44:24 compute-0 nova_compute[187223]: 2025-11-28 17:44:24.279 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1cc0848e-9a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:44:24 compute-0 nova_compute[187223]: 2025-11-28 17:44:24.279 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1cc0848e-9a, col_values=(('external_ids', {'iface-id': '1cc0848e-9ace-430b-998d-ccc5976c6756', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:57:a7:6c', 'vm-uuid': '7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:44:24 compute-0 NetworkManager[55763]: <info>  [1764351864.2835] manager: (tap1cc0848e-9a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/48)
Nov 28 17:44:24 compute-0 nova_compute[187223]: 2025-11-28 17:44:24.283 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 17:44:24 compute-0 nova_compute[187223]: 2025-11-28 17:44:24.289 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:44:24 compute-0 nova_compute[187223]: 2025-11-28 17:44:24.291 187227 INFO os_vif [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:57:a7:6c,bridge_name='br-int',has_traffic_filtering=True,id=1cc0848e-9ace-430b-998d-ccc5976c6756,network=Network(7710a7d0-31b3-4473-89c4-40533fdd6e7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1cc0848e-9a')
Nov 28 17:44:24 compute-0 nova_compute[187223]: 2025-11-28 17:44:24.344 187227 DEBUG nova.virt.libvirt.driver [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 28 17:44:24 compute-0 nova_compute[187223]: 2025-11-28 17:44:24.344 187227 DEBUG nova.virt.libvirt.driver [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 28 17:44:24 compute-0 nova_compute[187223]: 2025-11-28 17:44:24.344 187227 DEBUG nova.virt.libvirt.driver [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] No VIF found with MAC fa:16:3e:57:a7:6c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 28 17:44:24 compute-0 nova_compute[187223]: 2025-11-28 17:44:24.345 187227 INFO nova.virt.libvirt.driver [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Using config drive
Nov 28 17:44:24 compute-0 nova_compute[187223]: 2025-11-28 17:44:24.710 187227 INFO nova.virt.libvirt.driver [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Creating config drive at /var/lib/nova/instances/7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e/disk.config
Nov 28 17:44:24 compute-0 nova_compute[187223]: 2025-11-28 17:44:24.721 187227 DEBUG oslo_concurrency.processutils [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpit11xjss execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:44:24 compute-0 nova_compute[187223]: 2025-11-28 17:44:24.853 187227 DEBUG oslo_concurrency.processutils [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpit11xjss" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:44:24 compute-0 kernel: tap1cc0848e-9a: entered promiscuous mode
Nov 28 17:44:24 compute-0 NetworkManager[55763]: <info>  [1764351864.9382] manager: (tap1cc0848e-9a): new Tun device (/org/freedesktop/NetworkManager/Devices/49)
Nov 28 17:44:24 compute-0 ovn_controller[95574]: 2025-11-28T17:44:24Z|00113|binding|INFO|Claiming lport 1cc0848e-9ace-430b-998d-ccc5976c6756 for this chassis.
Nov 28 17:44:24 compute-0 ovn_controller[95574]: 2025-11-28T17:44:24Z|00114|binding|INFO|1cc0848e-9ace-430b-998d-ccc5976c6756: Claiming fa:16:3e:57:a7:6c 10.100.0.13
Nov 28 17:44:24 compute-0 nova_compute[187223]: 2025-11-28 17:44:24.938 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:44:24 compute-0 ovn_controller[95574]: 2025-11-28T17:44:24Z|00115|binding|INFO|Setting lport 1cc0848e-9ace-430b-998d-ccc5976c6756 ovn-installed in OVS
Nov 28 17:44:24 compute-0 ovn_controller[95574]: 2025-11-28T17:44:24Z|00116|binding|INFO|Setting lport 1cc0848e-9ace-430b-998d-ccc5976c6756 up in Southbound
Nov 28 17:44:24 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:44:24.950 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:57:a7:6c 10.100.0.13'], port_security=['fa:16:3e:57:a7:6c 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f987f40adf1f46018ab0ca81b8d954f6', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2454196c-3f33-40c4-b65e-ff5ce51cee25', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8b91a17c-2725-4cda-baa8-a6987e8e4bba, chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], logical_port=1cc0848e-9ace-430b-998d-ccc5976c6756) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 17:44:24 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:44:24.951 104433 INFO neutron.agent.ovn.metadata.agent [-] Port 1cc0848e-9ace-430b-998d-ccc5976c6756 in datapath 7710a7d0-31b3-4473-89c4-40533fdd6e7d bound to our chassis
Nov 28 17:44:24 compute-0 nova_compute[187223]: 2025-11-28 17:44:24.951 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:44:24 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:44:24.952 104433 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7710a7d0-31b3-4473-89c4-40533fdd6e7d
Nov 28 17:44:24 compute-0 nova_compute[187223]: 2025-11-28 17:44:24.953 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:44:24 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:44:24.964 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[fc894e37-12f4-489e-bd7a-fb26c934cc8d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:44:24 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:44:24.966 104433 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7710a7d0-31 in ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 28 17:44:24 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:44:24.968 208826 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7710a7d0-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 28 17:44:24 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:44:24.968 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[acf506a7-946f-43a5-b932-8e947bf4810a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:44:24 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:44:24.969 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[2b05fd5f-db4f-4369-bd48-44bea26c8d12]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:44:24 compute-0 systemd-machined[153517]: New machine qemu-10-instance-0000000d.
Nov 28 17:44:24 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:44:24.981 104546 DEBUG oslo.privsep.daemon [-] privsep: reply[502cdd32-63d5-4d88-823a-a7fbd51633a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:44:24 compute-0 systemd[1]: Started Virtual Machine qemu-10-instance-0000000d.
Nov 28 17:44:25 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:44:25.009 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[3d667ce9-b7e5-45ef-9919-3c4f3f6a9679]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:44:25 compute-0 systemd-udevd[213457]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 17:44:25 compute-0 NetworkManager[55763]: <info>  [1764351865.0446] device (tap1cc0848e-9a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 28 17:44:25 compute-0 NetworkManager[55763]: <info>  [1764351865.0458] device (tap1cc0848e-9a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 28 17:44:25 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:44:25.049 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[9cadf337-07ce-4e94-8414-fb5078e3de38]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:44:25 compute-0 systemd-udevd[213462]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 17:44:25 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:44:25.055 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[d7ea9754-f541-496a-92d2-a11369650ed5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:44:25 compute-0 NetworkManager[55763]: <info>  [1764351865.0569] manager: (tap7710a7d0-30): new Veth device (/org/freedesktop/NetworkManager/Devices/50)
Nov 28 17:44:25 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:44:25.098 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[918341f6-8de3-4c17-b041-9587c81b338f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:44:25 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:44:25.102 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[dcdae07a-ec07-4d8a-aa75-a2575936ffb5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:44:25 compute-0 NetworkManager[55763]: <info>  [1764351865.1365] device (tap7710a7d0-30): carrier: link connected
Nov 28 17:44:25 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:44:25.142 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[d3eb9ef6-5987-4895-ae4a-6dfca3d451b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:44:25 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:44:25.172 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[d877911e-6e47-4e33-be9a-3fdb28e60f33]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7710a7d0-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7e:b9:9f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 34], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 503976, 'reachable_time': 37798, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213487, 'error': None, 'target': 'ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:44:25 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:44:25.189 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[0aa3c1f6-ddfe-463f-b7a9-693898cc3424]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe7e:b99f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 503976, 'tstamp': 503976}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213488, 'error': None, 'target': 'ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:44:25 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:44:25.211 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[5f65dabc-6ab3-426b-b8d6-ef4dac8cf155]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7710a7d0-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7e:b9:9f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 34], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 503976, 'reachable_time': 37798, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 213489, 'error': None, 'target': 'ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:44:25 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:44:25.249 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[b396a3ae-773b-488e-991d-5af83a2dfb55]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:44:25 compute-0 nova_compute[187223]: 2025-11-28 17:44:25.306 187227 DEBUG nova.compute.manager [req-7a9ddb84-97bb-480d-86e4-26d36f5a3ed1 req-b6a6cb1e-3b59-4cda-b74d-bd39d3cc9c41 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Received event network-vif-plugged-1cc0848e-9ace-430b-998d-ccc5976c6756 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:44:25 compute-0 nova_compute[187223]: 2025-11-28 17:44:25.306 187227 DEBUG oslo_concurrency.lockutils [req-7a9ddb84-97bb-480d-86e4-26d36f5a3ed1 req-b6a6cb1e-3b59-4cda-b74d-bd39d3cc9c41 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:44:25 compute-0 nova_compute[187223]: 2025-11-28 17:44:25.307 187227 DEBUG oslo_concurrency.lockutils [req-7a9ddb84-97bb-480d-86e4-26d36f5a3ed1 req-b6a6cb1e-3b59-4cda-b74d-bd39d3cc9c41 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:44:25 compute-0 nova_compute[187223]: 2025-11-28 17:44:25.307 187227 DEBUG oslo_concurrency.lockutils [req-7a9ddb84-97bb-480d-86e4-26d36f5a3ed1 req-b6a6cb1e-3b59-4cda-b74d-bd39d3cc9c41 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:44:25 compute-0 nova_compute[187223]: 2025-11-28 17:44:25.307 187227 DEBUG nova.compute.manager [req-7a9ddb84-97bb-480d-86e4-26d36f5a3ed1 req-b6a6cb1e-3b59-4cda-b74d-bd39d3cc9c41 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Processing event network-vif-plugged-1cc0848e-9ace-430b-998d-ccc5976c6756 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 28 17:44:25 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:44:25.333 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[f3741ba5-3c83-46c3-a0f5-82982b254013]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:44:25 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:44:25.335 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7710a7d0-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:44:25 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:44:25.335 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 17:44:25 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:44:25.336 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7710a7d0-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:44:25 compute-0 NetworkManager[55763]: <info>  [1764351865.3790] manager: (tap7710a7d0-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/51)
Nov 28 17:44:25 compute-0 kernel: tap7710a7d0-30: entered promiscuous mode
Nov 28 17:44:25 compute-0 nova_compute[187223]: 2025-11-28 17:44:25.378 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:44:25 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:44:25.382 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7710a7d0-30, col_values=(('external_ids', {'iface-id': 'bc789832-2d4b-4b14-95c2-e30a740a3a6b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:44:25 compute-0 nova_compute[187223]: 2025-11-28 17:44:25.383 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:44:25 compute-0 ovn_controller[95574]: 2025-11-28T17:44:25Z|00117|binding|INFO|Releasing lport bc789832-2d4b-4b14-95c2-e30a740a3a6b from this chassis (sb_readonly=0)
Nov 28 17:44:25 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:44:25.384 104433 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7710a7d0-31b3-4473-89c4-40533fdd6e7d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7710a7d0-31b3-4473-89c4-40533fdd6e7d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 28 17:44:25 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:44:25.390 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[07561ed2-cf0d-4db5-9b15-c03595f3a749]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:44:25 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:44:25.392 104433 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 28 17:44:25 compute-0 ovn_metadata_agent[104428]: global
Nov 28 17:44:25 compute-0 ovn_metadata_agent[104428]:     log         /dev/log local0 debug
Nov 28 17:44:25 compute-0 ovn_metadata_agent[104428]:     log-tag     haproxy-metadata-proxy-7710a7d0-31b3-4473-89c4-40533fdd6e7d
Nov 28 17:44:25 compute-0 ovn_metadata_agent[104428]:     user        root
Nov 28 17:44:25 compute-0 ovn_metadata_agent[104428]:     group       root
Nov 28 17:44:25 compute-0 ovn_metadata_agent[104428]:     maxconn     1024
Nov 28 17:44:25 compute-0 ovn_metadata_agent[104428]:     pidfile     /var/lib/neutron/external/pids/7710a7d0-31b3-4473-89c4-40533fdd6e7d.pid.haproxy
Nov 28 17:44:25 compute-0 ovn_metadata_agent[104428]:     daemon
Nov 28 17:44:25 compute-0 ovn_metadata_agent[104428]: 
Nov 28 17:44:25 compute-0 ovn_metadata_agent[104428]: defaults
Nov 28 17:44:25 compute-0 ovn_metadata_agent[104428]:     log global
Nov 28 17:44:25 compute-0 ovn_metadata_agent[104428]:     mode http
Nov 28 17:44:25 compute-0 ovn_metadata_agent[104428]:     option httplog
Nov 28 17:44:25 compute-0 ovn_metadata_agent[104428]:     option dontlognull
Nov 28 17:44:25 compute-0 ovn_metadata_agent[104428]:     option http-server-close
Nov 28 17:44:25 compute-0 ovn_metadata_agent[104428]:     option forwardfor
Nov 28 17:44:25 compute-0 ovn_metadata_agent[104428]:     retries                 3
Nov 28 17:44:25 compute-0 ovn_metadata_agent[104428]:     timeout http-request    30s
Nov 28 17:44:25 compute-0 ovn_metadata_agent[104428]:     timeout connect         30s
Nov 28 17:44:25 compute-0 ovn_metadata_agent[104428]:     timeout client          32s
Nov 28 17:44:25 compute-0 ovn_metadata_agent[104428]:     timeout server          32s
Nov 28 17:44:25 compute-0 ovn_metadata_agent[104428]:     timeout http-keep-alive 30s
Nov 28 17:44:25 compute-0 ovn_metadata_agent[104428]: 
Nov 28 17:44:25 compute-0 ovn_metadata_agent[104428]: 
Nov 28 17:44:25 compute-0 ovn_metadata_agent[104428]: listen listener
Nov 28 17:44:25 compute-0 ovn_metadata_agent[104428]:     bind 169.254.169.254:80
Nov 28 17:44:25 compute-0 ovn_metadata_agent[104428]:     server metadata /var/lib/neutron/metadata_proxy
Nov 28 17:44:25 compute-0 ovn_metadata_agent[104428]:     http-request add-header X-OVN-Network-ID 7710a7d0-31b3-4473-89c4-40533fdd6e7d
Nov 28 17:44:25 compute-0 ovn_metadata_agent[104428]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 28 17:44:25 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:44:25.393 104433 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'env', 'PROCESS_TAG=haproxy-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7710a7d0-31b3-4473-89c4-40533fdd6e7d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 28 17:44:25 compute-0 nova_compute[187223]: 2025-11-28 17:44:25.398 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:44:25 compute-0 nova_compute[187223]: 2025-11-28 17:44:25.454 187227 DEBUG nova.virt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Emitting event <LifecycleEvent: 1764351865.4537964, 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:44:25 compute-0 nova_compute[187223]: 2025-11-28 17:44:25.454 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] VM Started (Lifecycle Event)
Nov 28 17:44:25 compute-0 nova_compute[187223]: 2025-11-28 17:44:25.457 187227 DEBUG nova.compute.manager [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 28 17:44:25 compute-0 nova_compute[187223]: 2025-11-28 17:44:25.462 187227 DEBUG nova.network.neutron [req-322a46e8-4570-49d0-80da-48189ce9b39b req-069b619d-437a-44c8-9ae8-6ee784e7467d 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Updated VIF entry in instance network info cache for port 1cc0848e-9ace-430b-998d-ccc5976c6756. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 28 17:44:25 compute-0 nova_compute[187223]: 2025-11-28 17:44:25.463 187227 DEBUG nova.network.neutron [req-322a46e8-4570-49d0-80da-48189ce9b39b req-069b619d-437a-44c8-9ae8-6ee784e7467d 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Updating instance_info_cache with network_info: [{"id": "1cc0848e-9ace-430b-998d-ccc5976c6756", "address": "fa:16:3e:57:a7:6c", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1cc0848e-9a", "ovs_interfaceid": "1cc0848e-9ace-430b-998d-ccc5976c6756", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:44:25 compute-0 nova_compute[187223]: 2025-11-28 17:44:25.465 187227 DEBUG nova.virt.libvirt.driver [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 28 17:44:25 compute-0 nova_compute[187223]: 2025-11-28 17:44:25.469 187227 INFO nova.virt.libvirt.driver [-] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Instance spawned successfully.
Nov 28 17:44:25 compute-0 nova_compute[187223]: 2025-11-28 17:44:25.470 187227 DEBUG nova.virt.libvirt.driver [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 28 17:44:25 compute-0 nova_compute[187223]: 2025-11-28 17:44:25.477 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:44:25 compute-0 nova_compute[187223]: 2025-11-28 17:44:25.482 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 28 17:44:25 compute-0 nova_compute[187223]: 2025-11-28 17:44:25.486 187227 DEBUG oslo_concurrency.lockutils [req-322a46e8-4570-49d0-80da-48189ce9b39b req-069b619d-437a-44c8-9ae8-6ee784e7467d 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Releasing lock "refresh_cache-7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 17:44:25 compute-0 nova_compute[187223]: 2025-11-28 17:44:25.492 187227 DEBUG nova.virt.libvirt.driver [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:44:25 compute-0 nova_compute[187223]: 2025-11-28 17:44:25.493 187227 DEBUG nova.virt.libvirt.driver [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:44:25 compute-0 nova_compute[187223]: 2025-11-28 17:44:25.494 187227 DEBUG nova.virt.libvirt.driver [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:44:25 compute-0 nova_compute[187223]: 2025-11-28 17:44:25.494 187227 DEBUG nova.virt.libvirt.driver [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:44:25 compute-0 nova_compute[187223]: 2025-11-28 17:44:25.495 187227 DEBUG nova.virt.libvirt.driver [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:44:25 compute-0 nova_compute[187223]: 2025-11-28 17:44:25.495 187227 DEBUG nova.virt.libvirt.driver [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:44:25 compute-0 nova_compute[187223]: 2025-11-28 17:44:25.500 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 28 17:44:25 compute-0 nova_compute[187223]: 2025-11-28 17:44:25.501 187227 DEBUG nova.virt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Emitting event <LifecycleEvent: 1764351865.4539785, 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:44:25 compute-0 nova_compute[187223]: 2025-11-28 17:44:25.501 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] VM Paused (Lifecycle Event)
Nov 28 17:44:25 compute-0 nova_compute[187223]: 2025-11-28 17:44:25.534 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:44:25 compute-0 nova_compute[187223]: 2025-11-28 17:44:25.537 187227 DEBUG nova.virt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Emitting event <LifecycleEvent: 1764351865.4631631, 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:44:25 compute-0 nova_compute[187223]: 2025-11-28 17:44:25.538 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] VM Resumed (Lifecycle Event)
Nov 28 17:44:25 compute-0 nova_compute[187223]: 2025-11-28 17:44:25.560 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:44:25 compute-0 nova_compute[187223]: 2025-11-28 17:44:25.564 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 28 17:44:25 compute-0 nova_compute[187223]: 2025-11-28 17:44:25.570 187227 INFO nova.compute.manager [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Took 4.36 seconds to spawn the instance on the hypervisor.
Nov 28 17:44:25 compute-0 nova_compute[187223]: 2025-11-28 17:44:25.571 187227 DEBUG nova.compute.manager [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:44:25 compute-0 nova_compute[187223]: 2025-11-28 17:44:25.581 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 28 17:44:25 compute-0 nova_compute[187223]: 2025-11-28 17:44:25.631 187227 INFO nova.compute.manager [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Took 4.85 seconds to build instance.
Nov 28 17:44:25 compute-0 nova_compute[187223]: 2025-11-28 17:44:25.649 187227 DEBUG oslo_concurrency.lockutils [None req-0a2e95f0-2c34-4ca6-9fdc-dc4125b2e30c 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.928s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:44:25 compute-0 podman[213528]: 2025-11-28 17:44:25.771237665 +0000 UTC m=+0.024245196 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 17:44:25 compute-0 podman[213528]: 2025-11-28 17:44:25.915944842 +0000 UTC m=+0.168952353 container create 7107bafa037e683aba0ecc931c5d03702a335e7b38e40d6bd0834035f50dff5a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 17:44:25 compute-0 systemd[1]: Started libpod-conmon-7107bafa037e683aba0ecc931c5d03702a335e7b38e40d6bd0834035f50dff5a.scope.
Nov 28 17:44:26 compute-0 systemd[1]: Started libcrun container.
Nov 28 17:44:26 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/521f634775e69d1f908b0918dfec05d8cf2a2ad0b143056db77f87f624e2145d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 17:44:26 compute-0 podman[213528]: 2025-11-28 17:44:26.046910462 +0000 UTC m=+0.299918013 container init 7107bafa037e683aba0ecc931c5d03702a335e7b38e40d6bd0834035f50dff5a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 28 17:44:26 compute-0 podman[213528]: 2025-11-28 17:44:26.054445559 +0000 UTC m=+0.307453080 container start 7107bafa037e683aba0ecc931c5d03702a335e7b38e40d6bd0834035f50dff5a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 17:44:26 compute-0 neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d[213544]: [NOTICE]   (213548) : New worker (213550) forked
Nov 28 17:44:26 compute-0 neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d[213544]: [NOTICE]   (213548) : Loading success.
Nov 28 17:44:27 compute-0 nova_compute[187223]: 2025-11-28 17:44:27.402 187227 DEBUG nova.compute.manager [req-565257b5-f7a5-46b5-8bd3-5199a254822f req-5f6703bb-0aed-418d-bf45-e9fab0872aa1 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Received event network-vif-plugged-1cc0848e-9ace-430b-998d-ccc5976c6756 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:44:27 compute-0 nova_compute[187223]: 2025-11-28 17:44:27.404 187227 DEBUG oslo_concurrency.lockutils [req-565257b5-f7a5-46b5-8bd3-5199a254822f req-5f6703bb-0aed-418d-bf45-e9fab0872aa1 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:44:27 compute-0 nova_compute[187223]: 2025-11-28 17:44:27.404 187227 DEBUG oslo_concurrency.lockutils [req-565257b5-f7a5-46b5-8bd3-5199a254822f req-5f6703bb-0aed-418d-bf45-e9fab0872aa1 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:44:27 compute-0 nova_compute[187223]: 2025-11-28 17:44:27.404 187227 DEBUG oslo_concurrency.lockutils [req-565257b5-f7a5-46b5-8bd3-5199a254822f req-5f6703bb-0aed-418d-bf45-e9fab0872aa1 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:44:27 compute-0 nova_compute[187223]: 2025-11-28 17:44:27.405 187227 DEBUG nova.compute.manager [req-565257b5-f7a5-46b5-8bd3-5199a254822f req-5f6703bb-0aed-418d-bf45-e9fab0872aa1 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] No waiting events found dispatching network-vif-plugged-1cc0848e-9ace-430b-998d-ccc5976c6756 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:44:27 compute-0 nova_compute[187223]: 2025-11-28 17:44:27.405 187227 WARNING nova.compute.manager [req-565257b5-f7a5-46b5-8bd3-5199a254822f req-5f6703bb-0aed-418d-bf45-e9fab0872aa1 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Received unexpected event network-vif-plugged-1cc0848e-9ace-430b-998d-ccc5976c6756 for instance with vm_state active and task_state None.
Nov 28 17:44:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:44:27.693 104433 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:44:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:44:27.694 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:44:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:44:27.695 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:44:28 compute-0 nova_compute[187223]: 2025-11-28 17:44:28.801 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:44:29 compute-0 nova_compute[187223]: 2025-11-28 17:44:29.282 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:44:29 compute-0 podman[197556]: time="2025-11-28T17:44:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:44:29 compute-0 podman[197556]: @ - - [28/Nov/2025:17:44:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18323 "" "Go-http-client/1.1"
Nov 28 17:44:29 compute-0 podman[197556]: @ - - [28/Nov/2025:17:44:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3054 "" "Go-http-client/1.1"
Nov 28 17:44:31 compute-0 openstack_network_exporter[199717]: ERROR   17:44:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:44:31 compute-0 openstack_network_exporter[199717]: ERROR   17:44:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:44:31 compute-0 openstack_network_exporter[199717]: ERROR   17:44:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:44:31 compute-0 openstack_network_exporter[199717]: ERROR   17:44:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:44:31 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:44:31 compute-0 openstack_network_exporter[199717]: ERROR   17:44:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:44:31 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:44:33 compute-0 nova_compute[187223]: 2025-11-28 17:44:33.804 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:44:34 compute-0 nova_compute[187223]: 2025-11-28 17:44:34.285 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:44:34 compute-0 nova_compute[187223]: 2025-11-28 17:44:34.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:44:35 compute-0 podman[213559]: 2025-11-28 17:44:35.213425351 +0000 UTC m=+0.068509259 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 17:44:35 compute-0 nova_compute[187223]: 2025-11-28 17:44:35.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:44:35 compute-0 nova_compute[187223]: 2025-11-28 17:44:35.683 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 17:44:35 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:44:35.919 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3e:3c:af', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '0e:e0:60:f9:5f:25'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 17:44:35 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:44:35.921 104433 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 17:44:35 compute-0 nova_compute[187223]: 2025-11-28 17:44:35.923 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:44:37 compute-0 nova_compute[187223]: 2025-11-28 17:44:37.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:44:37 compute-0 nova_compute[187223]: 2025-11-28 17:44:37.685 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:44:37 compute-0 nova_compute[187223]: 2025-11-28 17:44:37.707 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:44:37 compute-0 nova_compute[187223]: 2025-11-28 17:44:37.708 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:44:37 compute-0 nova_compute[187223]: 2025-11-28 17:44:37.708 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:44:37 compute-0 nova_compute[187223]: 2025-11-28 17:44:37.709 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 17:44:37 compute-0 nova_compute[187223]: 2025-11-28 17:44:37.795 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:44:37 compute-0 nova_compute[187223]: 2025-11-28 17:44:37.863 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:44:37 compute-0 nova_compute[187223]: 2025-11-28 17:44:37.866 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:44:37 compute-0 nova_compute[187223]: 2025-11-28 17:44:37.933 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:44:38 compute-0 nova_compute[187223]: 2025-11-28 17:44:38.132 187227 WARNING nova.virt.libvirt.driver [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 17:44:38 compute-0 nova_compute[187223]: 2025-11-28 17:44:38.134 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5708MB free_disk=73.31370162963867GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 17:44:38 compute-0 nova_compute[187223]: 2025-11-28 17:44:38.134 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:44:38 compute-0 nova_compute[187223]: 2025-11-28 17:44:38.135 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:44:38 compute-0 nova_compute[187223]: 2025-11-28 17:44:38.227 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Instance 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 17:44:38 compute-0 nova_compute[187223]: 2025-11-28 17:44:38.228 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 17:44:38 compute-0 nova_compute[187223]: 2025-11-28 17:44:38.228 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 17:44:38 compute-0 nova_compute[187223]: 2025-11-28 17:44:38.271 187227 DEBUG nova.compute.provider_tree [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 17:44:38 compute-0 nova_compute[187223]: 2025-11-28 17:44:38.285 187227 DEBUG nova.scheduler.client.report [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 17:44:38 compute-0 nova_compute[187223]: 2025-11-28 17:44:38.310 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 17:44:38 compute-0 nova_compute[187223]: 2025-11-28 17:44:38.311 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.176s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:44:38 compute-0 ovn_controller[95574]: 2025-11-28T17:44:38Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:57:a7:6c 10.100.0.13
Nov 28 17:44:38 compute-0 ovn_controller[95574]: 2025-11-28T17:44:38Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:57:a7:6c 10.100.0.13
Nov 28 17:44:38 compute-0 nova_compute[187223]: 2025-11-28 17:44:38.807 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:44:38 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:44:38.924 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2ad2dbac-a967-40fb-b69b-7c374c5f8e9d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:44:39 compute-0 nova_compute[187223]: 2025-11-28 17:44:39.287 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:44:39 compute-0 nova_compute[187223]: 2025-11-28 17:44:39.306 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:44:39 compute-0 nova_compute[187223]: 2025-11-28 17:44:39.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:44:39 compute-0 nova_compute[187223]: 2025-11-28 17:44:39.685 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 17:44:39 compute-0 nova_compute[187223]: 2025-11-28 17:44:39.685 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 17:44:40 compute-0 nova_compute[187223]: 2025-11-28 17:44:40.324 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "refresh_cache-7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 17:44:40 compute-0 nova_compute[187223]: 2025-11-28 17:44:40.325 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquired lock "refresh_cache-7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 17:44:40 compute-0 nova_compute[187223]: 2025-11-28 17:44:40.325 187227 DEBUG nova.network.neutron [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 28 17:44:40 compute-0 nova_compute[187223]: 2025-11-28 17:44:40.325 187227 DEBUG nova.objects.instance [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:44:41 compute-0 podman[213612]: 2025-11-28 17:44:41.224271398 +0000 UTC m=+0.082944235 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent)
Nov 28 17:44:42 compute-0 nova_compute[187223]: 2025-11-28 17:44:42.549 187227 DEBUG nova.network.neutron [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Updating instance_info_cache with network_info: [{"id": "1cc0848e-9ace-430b-998d-ccc5976c6756", "address": "fa:16:3e:57:a7:6c", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1cc0848e-9a", "ovs_interfaceid": "1cc0848e-9ace-430b-998d-ccc5976c6756", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:44:42 compute-0 nova_compute[187223]: 2025-11-28 17:44:42.574 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Releasing lock "refresh_cache-7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 17:44:42 compute-0 nova_compute[187223]: 2025-11-28 17:44:42.575 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 28 17:44:42 compute-0 nova_compute[187223]: 2025-11-28 17:44:42.575 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:44:42 compute-0 nova_compute[187223]: 2025-11-28 17:44:42.576 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:44:43 compute-0 nova_compute[187223]: 2025-11-28 17:44:43.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:44:43 compute-0 nova_compute[187223]: 2025-11-28 17:44:43.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:44:43 compute-0 nova_compute[187223]: 2025-11-28 17:44:43.809 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:44:44 compute-0 nova_compute[187223]: 2025-11-28 17:44:44.290 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:44:47 compute-0 podman[213632]: 2025-11-28 17:44:47.243393011 +0000 UTC m=+0.089143364 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Nov 28 17:44:47 compute-0 podman[213633]: 2025-11-28 17:44:47.347824994 +0000 UTC m=+0.185624760 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 28 17:44:48 compute-0 nova_compute[187223]: 2025-11-28 17:44:48.856 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:44:49 compute-0 nova_compute[187223]: 2025-11-28 17:44:49.292 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:44:50 compute-0 podman[213678]: 2025-11-28 17:44:50.236546538 +0000 UTC m=+0.081940908 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., config_id=edpm, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, version=9.6, architecture=x86_64, release=1755695350, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, managed_by=edpm_ansible, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 17:44:53 compute-0 nova_compute[187223]: 2025-11-28 17:44:53.858 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:44:54 compute-0 nova_compute[187223]: 2025-11-28 17:44:54.294 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:44:58 compute-0 nova_compute[187223]: 2025-11-28 17:44:58.862 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:44:59 compute-0 nova_compute[187223]: 2025-11-28 17:44:59.296 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:44:59 compute-0 podman[197556]: time="2025-11-28T17:44:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:44:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:44:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18323 "" "Go-http-client/1.1"
Nov 28 17:44:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:44:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3063 "" "Go-http-client/1.1"
Nov 28 17:45:01 compute-0 openstack_network_exporter[199717]: ERROR   17:45:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:45:01 compute-0 openstack_network_exporter[199717]: ERROR   17:45:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:45:01 compute-0 openstack_network_exporter[199717]: ERROR   17:45:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:45:01 compute-0 openstack_network_exporter[199717]: ERROR   17:45:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:45:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:45:01 compute-0 openstack_network_exporter[199717]: ERROR   17:45:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:45:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:45:03 compute-0 nova_compute[187223]: 2025-11-28 17:45:03.355 187227 DEBUG nova.compute.manager [None req-37f10e22-6684-4e28-ac82-8e6939329da2 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Adding trait COMPUTE_STATUS_DISABLED to compute node resource provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 in placement. update_compute_provider_status /usr/lib/python3.9/site-packages/nova/compute/manager.py:610
Nov 28 17:45:03 compute-0 nova_compute[187223]: 2025-11-28 17:45:03.396 187227 DEBUG nova.compute.provider_tree [None req-37f10e22-6684-4e28-ac82-8e6939329da2 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Updating resource provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 generation from 16 to 18 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Nov 28 17:45:03 compute-0 nova_compute[187223]: 2025-11-28 17:45:03.866 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:45:04 compute-0 nova_compute[187223]: 2025-11-28 17:45:04.299 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:45:05 compute-0 ovn_controller[95574]: 2025-11-28T17:45:05Z|00118|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Nov 28 17:45:06 compute-0 podman[213699]: 2025-11-28 17:45:06.196297924 +0000 UTC m=+0.055096761 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 17:45:07 compute-0 nova_compute[187223]: 2025-11-28 17:45:07.334 187227 DEBUG nova.virt.libvirt.driver [None req-bf6efcff-b40d-406a-82f5-2d5e2d2332c0 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Check if temp file /var/lib/nova/instances/tmplw5yh6k4 exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Nov 28 17:45:07 compute-0 nova_compute[187223]: 2025-11-28 17:45:07.334 187227 DEBUG nova.compute.manager [None req-bf6efcff-b40d-406a-82f5-2d5e2d2332c0 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmplw5yh6k4',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Nov 28 17:45:08 compute-0 nova_compute[187223]: 2025-11-28 17:45:08.869 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:45:09 compute-0 nova_compute[187223]: 2025-11-28 17:45:09.295 187227 DEBUG oslo_concurrency.processutils [None req-bf6efcff-b40d-406a-82f5-2d5e2d2332c0 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:45:09 compute-0 nova_compute[187223]: 2025-11-28 17:45:09.325 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:45:09 compute-0 nova_compute[187223]: 2025-11-28 17:45:09.396 187227 DEBUG oslo_concurrency.processutils [None req-bf6efcff-b40d-406a-82f5-2d5e2d2332c0 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e/disk --force-share --output=json" returned: 0 in 0.101s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:45:09 compute-0 nova_compute[187223]: 2025-11-28 17:45:09.397 187227 DEBUG oslo_concurrency.processutils [None req-bf6efcff-b40d-406a-82f5-2d5e2d2332c0 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:45:09 compute-0 nova_compute[187223]: 2025-11-28 17:45:09.458 187227 DEBUG oslo_concurrency.processutils [None req-bf6efcff-b40d-406a-82f5-2d5e2d2332c0 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:45:12 compute-0 podman[213729]: 2025-11-28 17:45:12.199139671 +0000 UTC m=+0.055051920 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Nov 28 17:45:13 compute-0 sshd-session[213748]: Accepted publickey for nova from 192.168.122.101 port 52470 ssh2: ECDSA SHA256:MQeyK0000lOAPc/y+l43YtWHgJLzao0oTzY3RbV6Jns
Nov 28 17:45:13 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Nov 28 17:45:13 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 28 17:45:13 compute-0 systemd-logind[788]: New session 35 of user nova.
Nov 28 17:45:13 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 28 17:45:13 compute-0 systemd[1]: Starting User Manager for UID 42436...
Nov 28 17:45:13 compute-0 systemd[213752]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 28 17:45:13 compute-0 systemd[213752]: Queued start job for default target Main User Target.
Nov 28 17:45:13 compute-0 systemd[213752]: Created slice User Application Slice.
Nov 28 17:45:13 compute-0 systemd[213752]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 28 17:45:13 compute-0 systemd[213752]: Started Daily Cleanup of User's Temporary Directories.
Nov 28 17:45:13 compute-0 systemd[213752]: Reached target Paths.
Nov 28 17:45:13 compute-0 systemd[213752]: Reached target Timers.
Nov 28 17:45:13 compute-0 systemd[213752]: Starting D-Bus User Message Bus Socket...
Nov 28 17:45:13 compute-0 systemd[213752]: Starting Create User's Volatile Files and Directories...
Nov 28 17:45:13 compute-0 systemd[213752]: Finished Create User's Volatile Files and Directories.
Nov 28 17:45:13 compute-0 systemd[213752]: Listening on D-Bus User Message Bus Socket.
Nov 28 17:45:13 compute-0 systemd[213752]: Reached target Sockets.
Nov 28 17:45:13 compute-0 systemd[213752]: Reached target Basic System.
Nov 28 17:45:13 compute-0 systemd[213752]: Reached target Main User Target.
Nov 28 17:45:13 compute-0 systemd[213752]: Startup finished in 156ms.
Nov 28 17:45:13 compute-0 systemd[1]: Started User Manager for UID 42436.
Nov 28 17:45:13 compute-0 systemd[1]: Started Session 35 of User nova.
Nov 28 17:45:13 compute-0 sshd-session[213748]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 28 17:45:13 compute-0 sshd-session[213767]: Received disconnect from 192.168.122.101 port 52470:11: disconnected by user
Nov 28 17:45:13 compute-0 sshd-session[213767]: Disconnected from user nova 192.168.122.101 port 52470
Nov 28 17:45:13 compute-0 sshd-session[213748]: pam_unix(sshd:session): session closed for user nova
Nov 28 17:45:13 compute-0 systemd-logind[788]: Session 35 logged out. Waiting for processes to exit.
Nov 28 17:45:13 compute-0 systemd[1]: session-35.scope: Deactivated successfully.
Nov 28 17:45:13 compute-0 systemd-logind[788]: Removed session 35.
Nov 28 17:45:13 compute-0 nova_compute[187223]: 2025-11-28 17:45:13.871 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:45:14 compute-0 nova_compute[187223]: 2025-11-28 17:45:14.329 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:45:15 compute-0 nova_compute[187223]: 2025-11-28 17:45:15.547 187227 DEBUG nova.compute.manager [req-4ae33114-f604-4c07-bbc9-840b1ca732ed req-5486d318-14b1-40f1-91b4-abaae8049659 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Received event network-vif-unplugged-1cc0848e-9ace-430b-998d-ccc5976c6756 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:45:15 compute-0 nova_compute[187223]: 2025-11-28 17:45:15.547 187227 DEBUG oslo_concurrency.lockutils [req-4ae33114-f604-4c07-bbc9-840b1ca732ed req-5486d318-14b1-40f1-91b4-abaae8049659 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:45:15 compute-0 nova_compute[187223]: 2025-11-28 17:45:15.548 187227 DEBUG oslo_concurrency.lockutils [req-4ae33114-f604-4c07-bbc9-840b1ca732ed req-5486d318-14b1-40f1-91b4-abaae8049659 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:45:15 compute-0 nova_compute[187223]: 2025-11-28 17:45:15.548 187227 DEBUG oslo_concurrency.lockutils [req-4ae33114-f604-4c07-bbc9-840b1ca732ed req-5486d318-14b1-40f1-91b4-abaae8049659 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:45:15 compute-0 nova_compute[187223]: 2025-11-28 17:45:15.548 187227 DEBUG nova.compute.manager [req-4ae33114-f604-4c07-bbc9-840b1ca732ed req-5486d318-14b1-40f1-91b4-abaae8049659 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] No waiting events found dispatching network-vif-unplugged-1cc0848e-9ace-430b-998d-ccc5976c6756 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:45:15 compute-0 nova_compute[187223]: 2025-11-28 17:45:15.548 187227 DEBUG nova.compute.manager [req-4ae33114-f604-4c07-bbc9-840b1ca732ed req-5486d318-14b1-40f1-91b4-abaae8049659 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Received event network-vif-unplugged-1cc0848e-9ace-430b-998d-ccc5976c6756 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 28 17:45:15 compute-0 nova_compute[187223]: 2025-11-28 17:45:15.896 187227 INFO nova.compute.manager [None req-bf6efcff-b40d-406a-82f5-2d5e2d2332c0 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Took 6.44 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Nov 28 17:45:15 compute-0 nova_compute[187223]: 2025-11-28 17:45:15.896 187227 DEBUG nova.compute.manager [None req-bf6efcff-b40d-406a-82f5-2d5e2d2332c0 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 28 17:45:15 compute-0 nova_compute[187223]: 2025-11-28 17:45:15.913 187227 DEBUG nova.compute.manager [None req-bf6efcff-b40d-406a-82f5-2d5e2d2332c0 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmplw5yh6k4',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(151d2a5c-26e6-40ef-83d8-1b59dd26a4fd),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Nov 28 17:45:15 compute-0 nova_compute[187223]: 2025-11-28 17:45:15.935 187227 DEBUG nova.objects.instance [None req-bf6efcff-b40d-406a-82f5-2d5e2d2332c0 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lazy-loading 'migration_context' on Instance uuid 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:45:15 compute-0 nova_compute[187223]: 2025-11-28 17:45:15.937 187227 DEBUG nova.virt.libvirt.driver [None req-bf6efcff-b40d-406a-82f5-2d5e2d2332c0 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Nov 28 17:45:15 compute-0 nova_compute[187223]: 2025-11-28 17:45:15.939 187227 DEBUG nova.virt.libvirt.driver [None req-bf6efcff-b40d-406a-82f5-2d5e2d2332c0 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Nov 28 17:45:15 compute-0 nova_compute[187223]: 2025-11-28 17:45:15.939 187227 DEBUG nova.virt.libvirt.driver [None req-bf6efcff-b40d-406a-82f5-2d5e2d2332c0 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Nov 28 17:45:15 compute-0 nova_compute[187223]: 2025-11-28 17:45:15.957 187227 DEBUG nova.virt.libvirt.vif [None req-bf6efcff-b40d-406a-82f5-2d5e2d2332c0 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T17:44:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-352843531',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-352843531',id=13,image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-28T17:44:25Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='f987f40adf1f46018ab0ca81b8d954f6',ramdisk_id='',reservation_id='r-zjifhlpx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',ow
ner_project_name='tempest-TestExecuteStrategies-384316604',owner_user_name='tempest-TestExecuteStrategies-384316604-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-28T17:44:25Z,user_data=None,user_id='40bca16232f3471c8094a414f8874e9a',uuid=7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1cc0848e-9ace-430b-998d-ccc5976c6756", "address": "fa:16:3e:57:a7:6c", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap1cc0848e-9a", "ovs_interfaceid": "1cc0848e-9ace-430b-998d-ccc5976c6756", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 28 17:45:15 compute-0 nova_compute[187223]: 2025-11-28 17:45:15.958 187227 DEBUG nova.network.os_vif_util [None req-bf6efcff-b40d-406a-82f5-2d5e2d2332c0 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Converting VIF {"id": "1cc0848e-9ace-430b-998d-ccc5976c6756", "address": "fa:16:3e:57:a7:6c", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap1cc0848e-9a", "ovs_interfaceid": "1cc0848e-9ace-430b-998d-ccc5976c6756", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 17:45:15 compute-0 nova_compute[187223]: 2025-11-28 17:45:15.959 187227 DEBUG nova.network.os_vif_util [None req-bf6efcff-b40d-406a-82f5-2d5e2d2332c0 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:57:a7:6c,bridge_name='br-int',has_traffic_filtering=True,id=1cc0848e-9ace-430b-998d-ccc5976c6756,network=Network(7710a7d0-31b3-4473-89c4-40533fdd6e7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1cc0848e-9a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 17:45:15 compute-0 nova_compute[187223]: 2025-11-28 17:45:15.959 187227 DEBUG nova.virt.libvirt.migration [None req-bf6efcff-b40d-406a-82f5-2d5e2d2332c0 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Updating guest XML with vif config: <interface type="ethernet">
Nov 28 17:45:15 compute-0 nova_compute[187223]:   <mac address="fa:16:3e:57:a7:6c"/>
Nov 28 17:45:15 compute-0 nova_compute[187223]:   <model type="virtio"/>
Nov 28 17:45:15 compute-0 nova_compute[187223]:   <driver name="vhost" rx_queue_size="512"/>
Nov 28 17:45:15 compute-0 nova_compute[187223]:   <mtu size="1442"/>
Nov 28 17:45:15 compute-0 nova_compute[187223]:   <target dev="tap1cc0848e-9a"/>
Nov 28 17:45:15 compute-0 nova_compute[187223]: </interface>
Nov 28 17:45:15 compute-0 nova_compute[187223]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Nov 28 17:45:15 compute-0 nova_compute[187223]: 2025-11-28 17:45:15.960 187227 DEBUG nova.virt.libvirt.driver [None req-bf6efcff-b40d-406a-82f5-2d5e2d2332c0 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Nov 28 17:45:16 compute-0 nova_compute[187223]: 2025-11-28 17:45:16.442 187227 DEBUG nova.virt.libvirt.migration [None req-bf6efcff-b40d-406a-82f5-2d5e2d2332c0 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 28 17:45:16 compute-0 nova_compute[187223]: 2025-11-28 17:45:16.444 187227 INFO nova.virt.libvirt.migration [None req-bf6efcff-b40d-406a-82f5-2d5e2d2332c0 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Increasing downtime to 50 ms after 0 sec elapsed time
Nov 28 17:45:16 compute-0 nova_compute[187223]: 2025-11-28 17:45:16.519 187227 INFO nova.virt.libvirt.driver [None req-bf6efcff-b40d-406a-82f5-2d5e2d2332c0 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Nov 28 17:45:17 compute-0 nova_compute[187223]: 2025-11-28 17:45:17.022 187227 DEBUG nova.virt.libvirt.migration [None req-bf6efcff-b40d-406a-82f5-2d5e2d2332c0 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 28 17:45:17 compute-0 nova_compute[187223]: 2025-11-28 17:45:17.023 187227 DEBUG nova.virt.libvirt.migration [None req-bf6efcff-b40d-406a-82f5-2d5e2d2332c0 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 28 17:45:17 compute-0 nova_compute[187223]: 2025-11-28 17:45:17.528 187227 DEBUG nova.virt.libvirt.migration [None req-bf6efcff-b40d-406a-82f5-2d5e2d2332c0 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 28 17:45:17 compute-0 nova_compute[187223]: 2025-11-28 17:45:17.529 187227 DEBUG nova.virt.libvirt.migration [None req-bf6efcff-b40d-406a-82f5-2d5e2d2332c0 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 28 17:45:17 compute-0 nova_compute[187223]: 2025-11-28 17:45:17.633 187227 DEBUG nova.compute.manager [req-0de171f6-fb57-4954-aa08-061f9589755d req-a143e630-296c-4d06-bac5-b16245283563 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Received event network-vif-plugged-1cc0848e-9ace-430b-998d-ccc5976c6756 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:45:17 compute-0 nova_compute[187223]: 2025-11-28 17:45:17.634 187227 DEBUG oslo_concurrency.lockutils [req-0de171f6-fb57-4954-aa08-061f9589755d req-a143e630-296c-4d06-bac5-b16245283563 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:45:17 compute-0 nova_compute[187223]: 2025-11-28 17:45:17.634 187227 DEBUG oslo_concurrency.lockutils [req-0de171f6-fb57-4954-aa08-061f9589755d req-a143e630-296c-4d06-bac5-b16245283563 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:45:17 compute-0 nova_compute[187223]: 2025-11-28 17:45:17.635 187227 DEBUG oslo_concurrency.lockutils [req-0de171f6-fb57-4954-aa08-061f9589755d req-a143e630-296c-4d06-bac5-b16245283563 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:45:17 compute-0 nova_compute[187223]: 2025-11-28 17:45:17.635 187227 DEBUG nova.compute.manager [req-0de171f6-fb57-4954-aa08-061f9589755d req-a143e630-296c-4d06-bac5-b16245283563 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] No waiting events found dispatching network-vif-plugged-1cc0848e-9ace-430b-998d-ccc5976c6756 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:45:17 compute-0 nova_compute[187223]: 2025-11-28 17:45:17.636 187227 WARNING nova.compute.manager [req-0de171f6-fb57-4954-aa08-061f9589755d req-a143e630-296c-4d06-bac5-b16245283563 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Received unexpected event network-vif-plugged-1cc0848e-9ace-430b-998d-ccc5976c6756 for instance with vm_state active and task_state migrating.
Nov 28 17:45:17 compute-0 nova_compute[187223]: 2025-11-28 17:45:17.636 187227 DEBUG nova.compute.manager [req-0de171f6-fb57-4954-aa08-061f9589755d req-a143e630-296c-4d06-bac5-b16245283563 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Received event network-changed-1cc0848e-9ace-430b-998d-ccc5976c6756 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:45:17 compute-0 nova_compute[187223]: 2025-11-28 17:45:17.636 187227 DEBUG nova.compute.manager [req-0de171f6-fb57-4954-aa08-061f9589755d req-a143e630-296c-4d06-bac5-b16245283563 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Refreshing instance network info cache due to event network-changed-1cc0848e-9ace-430b-998d-ccc5976c6756. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 28 17:45:17 compute-0 nova_compute[187223]: 2025-11-28 17:45:17.637 187227 DEBUG oslo_concurrency.lockutils [req-0de171f6-fb57-4954-aa08-061f9589755d req-a143e630-296c-4d06-bac5-b16245283563 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "refresh_cache-7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 17:45:17 compute-0 nova_compute[187223]: 2025-11-28 17:45:17.637 187227 DEBUG oslo_concurrency.lockutils [req-0de171f6-fb57-4954-aa08-061f9589755d req-a143e630-296c-4d06-bac5-b16245283563 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquired lock "refresh_cache-7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 17:45:17 compute-0 nova_compute[187223]: 2025-11-28 17:45:17.638 187227 DEBUG nova.network.neutron [req-0de171f6-fb57-4954-aa08-061f9589755d req-a143e630-296c-4d06-bac5-b16245283563 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Refreshing network info cache for port 1cc0848e-9ace-430b-998d-ccc5976c6756 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 28 17:45:18 compute-0 nova_compute[187223]: 2025-11-28 17:45:18.021 187227 DEBUG nova.virt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Emitting event <LifecycleEvent: 1764351918.0205102, 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:45:18 compute-0 nova_compute[187223]: 2025-11-28 17:45:18.021 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] VM Paused (Lifecycle Event)
Nov 28 17:45:18 compute-0 nova_compute[187223]: 2025-11-28 17:45:18.034 187227 DEBUG nova.virt.libvirt.migration [None req-bf6efcff-b40d-406a-82f5-2d5e2d2332c0 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 28 17:45:18 compute-0 nova_compute[187223]: 2025-11-28 17:45:18.035 187227 DEBUG nova.virt.libvirt.migration [None req-bf6efcff-b40d-406a-82f5-2d5e2d2332c0 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 28 17:45:18 compute-0 nova_compute[187223]: 2025-11-28 17:45:18.043 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:45:18 compute-0 nova_compute[187223]: 2025-11-28 17:45:18.050 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 28 17:45:18 compute-0 nova_compute[187223]: 2025-11-28 17:45:18.083 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] During sync_power_state the instance has a pending task (migrating). Skip.
Nov 28 17:45:18 compute-0 kernel: tap1cc0848e-9a (unregistering): left promiscuous mode
Nov 28 17:45:18 compute-0 NetworkManager[55763]: <info>  [1764351918.1620] device (tap1cc0848e-9a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 28 17:45:18 compute-0 nova_compute[187223]: 2025-11-28 17:45:18.174 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:45:18 compute-0 ovn_controller[95574]: 2025-11-28T17:45:18Z|00119|binding|INFO|Releasing lport 1cc0848e-9ace-430b-998d-ccc5976c6756 from this chassis (sb_readonly=0)
Nov 28 17:45:18 compute-0 ovn_controller[95574]: 2025-11-28T17:45:18Z|00120|binding|INFO|Setting lport 1cc0848e-9ace-430b-998d-ccc5976c6756 down in Southbound
Nov 28 17:45:18 compute-0 ovn_controller[95574]: 2025-11-28T17:45:18Z|00121|binding|INFO|Removing iface tap1cc0848e-9a ovn-installed in OVS
Nov 28 17:45:18 compute-0 nova_compute[187223]: 2025-11-28 17:45:18.176 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:45:18 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:45:18.181 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:57:a7:6c 10.100.0.13'], port_security=['fa:16:3e:57:a7:6c 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '01f1e5e2-191c-41ea-9a37-abbc72987efb'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f987f40adf1f46018ab0ca81b8d954f6', 'neutron:revision_number': '8', 'neutron:security_group_ids': '2454196c-3f33-40c4-b65e-ff5ce51cee25', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8b91a17c-2725-4cda-baa8-a6987e8e4bba, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], logical_port=1cc0848e-9ace-430b-998d-ccc5976c6756) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 17:45:18 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:45:18.182 104433 INFO neutron.agent.ovn.metadata.agent [-] Port 1cc0848e-9ace-430b-998d-ccc5976c6756 in datapath 7710a7d0-31b3-4473-89c4-40533fdd6e7d unbound from our chassis
Nov 28 17:45:18 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:45:18.184 104433 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7710a7d0-31b3-4473-89c4-40533fdd6e7d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 17:45:18 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:45:18.185 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[7c14276b-1c1b-4aff-8f13-6272d8b5b6a0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:45:18 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:45:18.186 104433 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d namespace which is not needed anymore
Nov 28 17:45:18 compute-0 nova_compute[187223]: 2025-11-28 17:45:18.195 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:45:18 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000d.scope: Deactivated successfully.
Nov 28 17:45:18 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000d.scope: Consumed 15.278s CPU time.
Nov 28 17:45:18 compute-0 systemd-machined[153517]: Machine qemu-10-instance-0000000d terminated.
Nov 28 17:45:18 compute-0 podman[213773]: 2025-11-28 17:45:18.243960399 +0000 UTC m=+0.097776691 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Nov 28 17:45:18 compute-0 podman[213774]: 2025-11-28 17:45:18.274329812 +0000 UTC m=+0.122705165 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 28 17:45:18 compute-0 neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d[213544]: [NOTICE]   (213548) : haproxy version is 2.8.14-c23fe91
Nov 28 17:45:18 compute-0 neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d[213544]: [NOTICE]   (213548) : path to executable is /usr/sbin/haproxy
Nov 28 17:45:18 compute-0 neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d[213544]: [WARNING]  (213548) : Exiting Master process...
Nov 28 17:45:18 compute-0 neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d[213544]: [WARNING]  (213548) : Exiting Master process...
Nov 28 17:45:18 compute-0 neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d[213544]: [ALERT]    (213548) : Current worker (213550) exited with code 143 (Terminated)
Nov 28 17:45:18 compute-0 neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d[213544]: [WARNING]  (213548) : All workers exited. Exiting... (0)
Nov 28 17:45:18 compute-0 systemd[1]: libpod-7107bafa037e683aba0ecc931c5d03702a335e7b38e40d6bd0834035f50dff5a.scope: Deactivated successfully.
Nov 28 17:45:18 compute-0 podman[213841]: 2025-11-28 17:45:18.351250081 +0000 UTC m=+0.054204367 container died 7107bafa037e683aba0ecc931c5d03702a335e7b38e40d6bd0834035f50dff5a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 28 17:45:18 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7107bafa037e683aba0ecc931c5d03702a335e7b38e40d6bd0834035f50dff5a-userdata-shm.mount: Deactivated successfully.
Nov 28 17:45:18 compute-0 systemd[1]: var-lib-containers-storage-overlay-521f634775e69d1f908b0918dfec05d8cf2a2ad0b143056db77f87f624e2145d-merged.mount: Deactivated successfully.
Nov 28 17:45:18 compute-0 podman[213841]: 2025-11-28 17:45:18.408945582 +0000 UTC m=+0.111899858 container cleanup 7107bafa037e683aba0ecc931c5d03702a335e7b38e40d6bd0834035f50dff5a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 17:45:18 compute-0 systemd[1]: libpod-conmon-7107bafa037e683aba0ecc931c5d03702a335e7b38e40d6bd0834035f50dff5a.scope: Deactivated successfully.
Nov 28 17:45:18 compute-0 nova_compute[187223]: 2025-11-28 17:45:18.418 187227 DEBUG nova.virt.libvirt.driver [None req-bf6efcff-b40d-406a-82f5-2d5e2d2332c0 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Nov 28 17:45:18 compute-0 nova_compute[187223]: 2025-11-28 17:45:18.420 187227 DEBUG nova.virt.libvirt.driver [None req-bf6efcff-b40d-406a-82f5-2d5e2d2332c0 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Nov 28 17:45:18 compute-0 nova_compute[187223]: 2025-11-28 17:45:18.420 187227 DEBUG nova.virt.libvirt.driver [None req-bf6efcff-b40d-406a-82f5-2d5e2d2332c0 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Nov 28 17:45:18 compute-0 podman[213885]: 2025-11-28 17:45:18.500259986 +0000 UTC m=+0.059988176 container remove 7107bafa037e683aba0ecc931c5d03702a335e7b38e40d6bd0834035f50dff5a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 17:45:18 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:45:18.507 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[badf4ae4-0848-41f0-9819-301130188778]: (4, ('Fri Nov 28 05:45:18 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d (7107bafa037e683aba0ecc931c5d03702a335e7b38e40d6bd0834035f50dff5a)\n7107bafa037e683aba0ecc931c5d03702a335e7b38e40d6bd0834035f50dff5a\nFri Nov 28 05:45:18 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d (7107bafa037e683aba0ecc931c5d03702a335e7b38e40d6bd0834035f50dff5a)\n7107bafa037e683aba0ecc931c5d03702a335e7b38e40d6bd0834035f50dff5a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:45:18 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:45:18.509 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[bcb2f892-ccf0-4341-a7a1-ceeb5b645c02]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:45:18 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:45:18.510 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7710a7d0-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:45:18 compute-0 nova_compute[187223]: 2025-11-28 17:45:18.541 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:45:18 compute-0 kernel: tap7710a7d0-30: left promiscuous mode
Nov 28 17:45:18 compute-0 nova_compute[187223]: 2025-11-28 17:45:18.544 187227 DEBUG nova.virt.libvirt.guest [None req-bf6efcff-b40d-406a-82f5-2d5e2d2332c0 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e' (instance-0000000d) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Nov 28 17:45:18 compute-0 nova_compute[187223]: 2025-11-28 17:45:18.545 187227 INFO nova.virt.libvirt.driver [None req-bf6efcff-b40d-406a-82f5-2d5e2d2332c0 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Migration operation has completed
Nov 28 17:45:18 compute-0 nova_compute[187223]: 2025-11-28 17:45:18.545 187227 INFO nova.compute.manager [None req-bf6efcff-b40d-406a-82f5-2d5e2d2332c0 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] _post_live_migration() is started..
Nov 28 17:45:18 compute-0 nova_compute[187223]: 2025-11-28 17:45:18.558 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:45:18 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:45:18.564 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[adf98ae3-23fd-4280-a1cc-91ac7b61b82c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:45:18 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:45:18.578 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[fed32d10-90d3-43b2-b798-b5c3a125082f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:45:18 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:45:18.580 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[a20faa1b-cfe7-4f1f-a654-3190c5b09c3f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:45:18 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:45:18.602 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[f9a45d02-5198-4446-8e30-099edc358333]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 503967, 'reachable_time': 27520, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213904, 'error': None, 'target': 'ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:45:18 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:45:18.606 104546 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 28 17:45:18 compute-0 systemd[1]: run-netns-ovnmeta\x2d7710a7d0\x2d31b3\x2d4473\x2d89c4\x2d40533fdd6e7d.mount: Deactivated successfully.
Nov 28 17:45:18 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:45:18.606 104546 DEBUG oslo.privsep.daemon [-] privsep: reply[e34c349c-1393-4092-8aa5-026b1e17109d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:45:18 compute-0 nova_compute[187223]: 2025-11-28 17:45:18.793 187227 DEBUG nova.network.neutron [req-0de171f6-fb57-4954-aa08-061f9589755d req-a143e630-296c-4d06-bac5-b16245283563 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Updated VIF entry in instance network info cache for port 1cc0848e-9ace-430b-998d-ccc5976c6756. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 28 17:45:18 compute-0 nova_compute[187223]: 2025-11-28 17:45:18.794 187227 DEBUG nova.network.neutron [req-0de171f6-fb57-4954-aa08-061f9589755d req-a143e630-296c-4d06-bac5-b16245283563 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Updating instance_info_cache with network_info: [{"id": "1cc0848e-9ace-430b-998d-ccc5976c6756", "address": "fa:16:3e:57:a7:6c", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1cc0848e-9a", "ovs_interfaceid": "1cc0848e-9ace-430b-998d-ccc5976c6756", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:45:18 compute-0 nova_compute[187223]: 2025-11-28 17:45:18.817 187227 DEBUG oslo_concurrency.lockutils [req-0de171f6-fb57-4954-aa08-061f9589755d req-a143e630-296c-4d06-bac5-b16245283563 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Releasing lock "refresh_cache-7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 17:45:18 compute-0 nova_compute[187223]: 2025-11-28 17:45:18.873 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:45:19 compute-0 nova_compute[187223]: 2025-11-28 17:45:19.133 187227 DEBUG nova.compute.manager [req-5132ce2f-c3b2-4e08-9b68-ba1466d9cb0c req-4d29ffbd-0b64-4840-9b9f-9e46505740d2 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Received event network-vif-unplugged-1cc0848e-9ace-430b-998d-ccc5976c6756 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:45:19 compute-0 nova_compute[187223]: 2025-11-28 17:45:19.134 187227 DEBUG oslo_concurrency.lockutils [req-5132ce2f-c3b2-4e08-9b68-ba1466d9cb0c req-4d29ffbd-0b64-4840-9b9f-9e46505740d2 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:45:19 compute-0 nova_compute[187223]: 2025-11-28 17:45:19.134 187227 DEBUG oslo_concurrency.lockutils [req-5132ce2f-c3b2-4e08-9b68-ba1466d9cb0c req-4d29ffbd-0b64-4840-9b9f-9e46505740d2 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:45:19 compute-0 nova_compute[187223]: 2025-11-28 17:45:19.134 187227 DEBUG oslo_concurrency.lockutils [req-5132ce2f-c3b2-4e08-9b68-ba1466d9cb0c req-4d29ffbd-0b64-4840-9b9f-9e46505740d2 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:45:19 compute-0 nova_compute[187223]: 2025-11-28 17:45:19.135 187227 DEBUG nova.compute.manager [req-5132ce2f-c3b2-4e08-9b68-ba1466d9cb0c req-4d29ffbd-0b64-4840-9b9f-9e46505740d2 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] No waiting events found dispatching network-vif-unplugged-1cc0848e-9ace-430b-998d-ccc5976c6756 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:45:19 compute-0 nova_compute[187223]: 2025-11-28 17:45:19.135 187227 DEBUG nova.compute.manager [req-5132ce2f-c3b2-4e08-9b68-ba1466d9cb0c req-4d29ffbd-0b64-4840-9b9f-9e46505740d2 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Received event network-vif-unplugged-1cc0848e-9ace-430b-998d-ccc5976c6756 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 28 17:45:19 compute-0 nova_compute[187223]: 2025-11-28 17:45:19.220 187227 DEBUG nova.network.neutron [None req-bf6efcff-b40d-406a-82f5-2d5e2d2332c0 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Activated binding for port 1cc0848e-9ace-430b-998d-ccc5976c6756 and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Nov 28 17:45:19 compute-0 nova_compute[187223]: 2025-11-28 17:45:19.221 187227 DEBUG nova.compute.manager [None req-bf6efcff-b40d-406a-82f5-2d5e2d2332c0 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "1cc0848e-9ace-430b-998d-ccc5976c6756", "address": "fa:16:3e:57:a7:6c", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1cc0848e-9a", "ovs_interfaceid": "1cc0848e-9ace-430b-998d-ccc5976c6756", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Nov 28 17:45:19 compute-0 nova_compute[187223]: 2025-11-28 17:45:19.222 187227 DEBUG nova.virt.libvirt.vif [None req-bf6efcff-b40d-406a-82f5-2d5e2d2332c0 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T17:44:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-352843531',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-352843531',id=13,image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-28T17:44:25Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='f987f40adf1f46018ab0ca81b8d954f6',ramdisk_id='',reservation_id='r-zjifhlpx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-384316604',owner_user_name='tempest-TestExecuteStrategies-384316604-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-28T17:45:05Z,user_data=None,user_id='40bca16232f3471c8094a414f8874e9a',uuid=7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1cc0848e-9ace-430b-998d-ccc5976c6756", "address": "fa:16:3e:57:a7:6c", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1cc0848e-9a", "ovs_interfaceid": "1cc0848e-9ace-430b-998d-ccc5976c6756", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 28 17:45:19 compute-0 nova_compute[187223]: 2025-11-28 17:45:19.222 187227 DEBUG nova.network.os_vif_util [None req-bf6efcff-b40d-406a-82f5-2d5e2d2332c0 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Converting VIF {"id": "1cc0848e-9ace-430b-998d-ccc5976c6756", "address": "fa:16:3e:57:a7:6c", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1cc0848e-9a", "ovs_interfaceid": "1cc0848e-9ace-430b-998d-ccc5976c6756", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 17:45:19 compute-0 nova_compute[187223]: 2025-11-28 17:45:19.223 187227 DEBUG nova.network.os_vif_util [None req-bf6efcff-b40d-406a-82f5-2d5e2d2332c0 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:57:a7:6c,bridge_name='br-int',has_traffic_filtering=True,id=1cc0848e-9ace-430b-998d-ccc5976c6756,network=Network(7710a7d0-31b3-4473-89c4-40533fdd6e7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1cc0848e-9a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 17:45:19 compute-0 nova_compute[187223]: 2025-11-28 17:45:19.223 187227 DEBUG os_vif [None req-bf6efcff-b40d-406a-82f5-2d5e2d2332c0 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:57:a7:6c,bridge_name='br-int',has_traffic_filtering=True,id=1cc0848e-9ace-430b-998d-ccc5976c6756,network=Network(7710a7d0-31b3-4473-89c4-40533fdd6e7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1cc0848e-9a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 28 17:45:19 compute-0 nova_compute[187223]: 2025-11-28 17:45:19.225 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:45:19 compute-0 nova_compute[187223]: 2025-11-28 17:45:19.225 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1cc0848e-9a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:45:19 compute-0 nova_compute[187223]: 2025-11-28 17:45:19.227 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:45:19 compute-0 nova_compute[187223]: 2025-11-28 17:45:19.230 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 17:45:19 compute-0 nova_compute[187223]: 2025-11-28 17:45:19.235 187227 INFO os_vif [None req-bf6efcff-b40d-406a-82f5-2d5e2d2332c0 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:57:a7:6c,bridge_name='br-int',has_traffic_filtering=True,id=1cc0848e-9ace-430b-998d-ccc5976c6756,network=Network(7710a7d0-31b3-4473-89c4-40533fdd6e7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1cc0848e-9a')
Nov 28 17:45:19 compute-0 nova_compute[187223]: 2025-11-28 17:45:19.235 187227 DEBUG oslo_concurrency.lockutils [None req-bf6efcff-b40d-406a-82f5-2d5e2d2332c0 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:45:19 compute-0 nova_compute[187223]: 2025-11-28 17:45:19.236 187227 DEBUG oslo_concurrency.lockutils [None req-bf6efcff-b40d-406a-82f5-2d5e2d2332c0 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:45:19 compute-0 nova_compute[187223]: 2025-11-28 17:45:19.236 187227 DEBUG oslo_concurrency.lockutils [None req-bf6efcff-b40d-406a-82f5-2d5e2d2332c0 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:45:19 compute-0 nova_compute[187223]: 2025-11-28 17:45:19.237 187227 DEBUG nova.compute.manager [None req-bf6efcff-b40d-406a-82f5-2d5e2d2332c0 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Nov 28 17:45:19 compute-0 nova_compute[187223]: 2025-11-28 17:45:19.238 187227 INFO nova.virt.libvirt.driver [None req-bf6efcff-b40d-406a-82f5-2d5e2d2332c0 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Deleting instance files /var/lib/nova/instances/7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e_del
Nov 28 17:45:19 compute-0 nova_compute[187223]: 2025-11-28 17:45:19.239 187227 INFO nova.virt.libvirt.driver [None req-bf6efcff-b40d-406a-82f5-2d5e2d2332c0 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Deletion of /var/lib/nova/instances/7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e_del complete
Nov 28 17:45:19 compute-0 nova_compute[187223]: 2025-11-28 17:45:19.723 187227 DEBUG nova.compute.manager [req-f7467b34-fd05-430b-843d-6452c4e84604 req-616a8f90-e3b8-4b23-8081-0a58fa996857 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Received event network-vif-unplugged-1cc0848e-9ace-430b-998d-ccc5976c6756 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:45:19 compute-0 nova_compute[187223]: 2025-11-28 17:45:19.724 187227 DEBUG oslo_concurrency.lockutils [req-f7467b34-fd05-430b-843d-6452c4e84604 req-616a8f90-e3b8-4b23-8081-0a58fa996857 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:45:19 compute-0 nova_compute[187223]: 2025-11-28 17:45:19.724 187227 DEBUG oslo_concurrency.lockutils [req-f7467b34-fd05-430b-843d-6452c4e84604 req-616a8f90-e3b8-4b23-8081-0a58fa996857 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:45:19 compute-0 nova_compute[187223]: 2025-11-28 17:45:19.724 187227 DEBUG oslo_concurrency.lockutils [req-f7467b34-fd05-430b-843d-6452c4e84604 req-616a8f90-e3b8-4b23-8081-0a58fa996857 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:45:19 compute-0 nova_compute[187223]: 2025-11-28 17:45:19.724 187227 DEBUG nova.compute.manager [req-f7467b34-fd05-430b-843d-6452c4e84604 req-616a8f90-e3b8-4b23-8081-0a58fa996857 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] No waiting events found dispatching network-vif-unplugged-1cc0848e-9ace-430b-998d-ccc5976c6756 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:45:19 compute-0 nova_compute[187223]: 2025-11-28 17:45:19.724 187227 DEBUG nova.compute.manager [req-f7467b34-fd05-430b-843d-6452c4e84604 req-616a8f90-e3b8-4b23-8081-0a58fa996857 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Received event network-vif-unplugged-1cc0848e-9ace-430b-998d-ccc5976c6756 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 28 17:45:19 compute-0 nova_compute[187223]: 2025-11-28 17:45:19.725 187227 DEBUG nova.compute.manager [req-f7467b34-fd05-430b-843d-6452c4e84604 req-616a8f90-e3b8-4b23-8081-0a58fa996857 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Received event network-vif-plugged-1cc0848e-9ace-430b-998d-ccc5976c6756 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:45:19 compute-0 nova_compute[187223]: 2025-11-28 17:45:19.725 187227 DEBUG oslo_concurrency.lockutils [req-f7467b34-fd05-430b-843d-6452c4e84604 req-616a8f90-e3b8-4b23-8081-0a58fa996857 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:45:19 compute-0 nova_compute[187223]: 2025-11-28 17:45:19.725 187227 DEBUG oslo_concurrency.lockutils [req-f7467b34-fd05-430b-843d-6452c4e84604 req-616a8f90-e3b8-4b23-8081-0a58fa996857 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:45:19 compute-0 nova_compute[187223]: 2025-11-28 17:45:19.725 187227 DEBUG oslo_concurrency.lockutils [req-f7467b34-fd05-430b-843d-6452c4e84604 req-616a8f90-e3b8-4b23-8081-0a58fa996857 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:45:19 compute-0 nova_compute[187223]: 2025-11-28 17:45:19.726 187227 DEBUG nova.compute.manager [req-f7467b34-fd05-430b-843d-6452c4e84604 req-616a8f90-e3b8-4b23-8081-0a58fa996857 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] No waiting events found dispatching network-vif-plugged-1cc0848e-9ace-430b-998d-ccc5976c6756 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:45:19 compute-0 nova_compute[187223]: 2025-11-28 17:45:19.726 187227 WARNING nova.compute.manager [req-f7467b34-fd05-430b-843d-6452c4e84604 req-616a8f90-e3b8-4b23-8081-0a58fa996857 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Received unexpected event network-vif-plugged-1cc0848e-9ace-430b-998d-ccc5976c6756 for instance with vm_state active and task_state migrating.
Nov 28 17:45:19 compute-0 nova_compute[187223]: 2025-11-28 17:45:19.726 187227 DEBUG nova.compute.manager [req-f7467b34-fd05-430b-843d-6452c4e84604 req-616a8f90-e3b8-4b23-8081-0a58fa996857 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Received event network-vif-plugged-1cc0848e-9ace-430b-998d-ccc5976c6756 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:45:19 compute-0 nova_compute[187223]: 2025-11-28 17:45:19.726 187227 DEBUG oslo_concurrency.lockutils [req-f7467b34-fd05-430b-843d-6452c4e84604 req-616a8f90-e3b8-4b23-8081-0a58fa996857 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:45:19 compute-0 nova_compute[187223]: 2025-11-28 17:45:19.726 187227 DEBUG oslo_concurrency.lockutils [req-f7467b34-fd05-430b-843d-6452c4e84604 req-616a8f90-e3b8-4b23-8081-0a58fa996857 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:45:19 compute-0 nova_compute[187223]: 2025-11-28 17:45:19.727 187227 DEBUG oslo_concurrency.lockutils [req-f7467b34-fd05-430b-843d-6452c4e84604 req-616a8f90-e3b8-4b23-8081-0a58fa996857 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:45:19 compute-0 nova_compute[187223]: 2025-11-28 17:45:19.727 187227 DEBUG nova.compute.manager [req-f7467b34-fd05-430b-843d-6452c4e84604 req-616a8f90-e3b8-4b23-8081-0a58fa996857 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] No waiting events found dispatching network-vif-plugged-1cc0848e-9ace-430b-998d-ccc5976c6756 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:45:19 compute-0 nova_compute[187223]: 2025-11-28 17:45:19.727 187227 WARNING nova.compute.manager [req-f7467b34-fd05-430b-843d-6452c4e84604 req-616a8f90-e3b8-4b23-8081-0a58fa996857 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Received unexpected event network-vif-plugged-1cc0848e-9ace-430b-998d-ccc5976c6756 for instance with vm_state active and task_state migrating.
Nov 28 17:45:19 compute-0 nova_compute[187223]: 2025-11-28 17:45:19.727 187227 DEBUG nova.compute.manager [req-f7467b34-fd05-430b-843d-6452c4e84604 req-616a8f90-e3b8-4b23-8081-0a58fa996857 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Received event network-vif-plugged-1cc0848e-9ace-430b-998d-ccc5976c6756 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:45:19 compute-0 nova_compute[187223]: 2025-11-28 17:45:19.727 187227 DEBUG oslo_concurrency.lockutils [req-f7467b34-fd05-430b-843d-6452c4e84604 req-616a8f90-e3b8-4b23-8081-0a58fa996857 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:45:19 compute-0 nova_compute[187223]: 2025-11-28 17:45:19.728 187227 DEBUG oslo_concurrency.lockutils [req-f7467b34-fd05-430b-843d-6452c4e84604 req-616a8f90-e3b8-4b23-8081-0a58fa996857 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:45:19 compute-0 nova_compute[187223]: 2025-11-28 17:45:19.728 187227 DEBUG oslo_concurrency.lockutils [req-f7467b34-fd05-430b-843d-6452c4e84604 req-616a8f90-e3b8-4b23-8081-0a58fa996857 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:45:19 compute-0 nova_compute[187223]: 2025-11-28 17:45:19.728 187227 DEBUG nova.compute.manager [req-f7467b34-fd05-430b-843d-6452c4e84604 req-616a8f90-e3b8-4b23-8081-0a58fa996857 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] No waiting events found dispatching network-vif-plugged-1cc0848e-9ace-430b-998d-ccc5976c6756 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:45:19 compute-0 nova_compute[187223]: 2025-11-28 17:45:19.728 187227 WARNING nova.compute.manager [req-f7467b34-fd05-430b-843d-6452c4e84604 req-616a8f90-e3b8-4b23-8081-0a58fa996857 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Received unexpected event network-vif-plugged-1cc0848e-9ace-430b-998d-ccc5976c6756 for instance with vm_state active and task_state migrating.
Nov 28 17:45:19 compute-0 nova_compute[187223]: 2025-11-28 17:45:19.729 187227 DEBUG nova.compute.manager [req-f7467b34-fd05-430b-843d-6452c4e84604 req-616a8f90-e3b8-4b23-8081-0a58fa996857 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Received event network-vif-plugged-1cc0848e-9ace-430b-998d-ccc5976c6756 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:45:19 compute-0 nova_compute[187223]: 2025-11-28 17:45:19.729 187227 DEBUG oslo_concurrency.lockutils [req-f7467b34-fd05-430b-843d-6452c4e84604 req-616a8f90-e3b8-4b23-8081-0a58fa996857 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:45:19 compute-0 nova_compute[187223]: 2025-11-28 17:45:19.729 187227 DEBUG oslo_concurrency.lockutils [req-f7467b34-fd05-430b-843d-6452c4e84604 req-616a8f90-e3b8-4b23-8081-0a58fa996857 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:45:19 compute-0 nova_compute[187223]: 2025-11-28 17:45:19.729 187227 DEBUG oslo_concurrency.lockutils [req-f7467b34-fd05-430b-843d-6452c4e84604 req-616a8f90-e3b8-4b23-8081-0a58fa996857 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:45:19 compute-0 nova_compute[187223]: 2025-11-28 17:45:19.730 187227 DEBUG nova.compute.manager [req-f7467b34-fd05-430b-843d-6452c4e84604 req-616a8f90-e3b8-4b23-8081-0a58fa996857 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] No waiting events found dispatching network-vif-plugged-1cc0848e-9ace-430b-998d-ccc5976c6756 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:45:19 compute-0 nova_compute[187223]: 2025-11-28 17:45:19.730 187227 WARNING nova.compute.manager [req-f7467b34-fd05-430b-843d-6452c4e84604 req-616a8f90-e3b8-4b23-8081-0a58fa996857 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Received unexpected event network-vif-plugged-1cc0848e-9ace-430b-998d-ccc5976c6756 for instance with vm_state active and task_state migrating.
Nov 28 17:45:21 compute-0 podman[213905]: 2025-11-28 17:45:21.230995908 +0000 UTC m=+0.081180066 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., version=9.6, name=ubi9-minimal, release=1755695350, managed_by=edpm_ansible, io.openshift.expose-services=, architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 28 17:45:23 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Nov 28 17:45:23 compute-0 systemd[213752]: Activating special unit Exit the Session...
Nov 28 17:45:23 compute-0 systemd[213752]: Stopped target Main User Target.
Nov 28 17:45:23 compute-0 systemd[213752]: Stopped target Basic System.
Nov 28 17:45:23 compute-0 systemd[213752]: Stopped target Paths.
Nov 28 17:45:23 compute-0 systemd[213752]: Stopped target Sockets.
Nov 28 17:45:23 compute-0 systemd[213752]: Stopped target Timers.
Nov 28 17:45:23 compute-0 systemd[213752]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 28 17:45:23 compute-0 systemd[213752]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 28 17:45:23 compute-0 systemd[213752]: Closed D-Bus User Message Bus Socket.
Nov 28 17:45:23 compute-0 systemd[213752]: Stopped Create User's Volatile Files and Directories.
Nov 28 17:45:23 compute-0 systemd[213752]: Removed slice User Application Slice.
Nov 28 17:45:23 compute-0 systemd[213752]: Reached target Shutdown.
Nov 28 17:45:23 compute-0 systemd[213752]: Finished Exit the Session.
Nov 28 17:45:23 compute-0 systemd[213752]: Reached target Exit the Session.
Nov 28 17:45:23 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Nov 28 17:45:23 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Nov 28 17:45:23 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 28 17:45:23 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 28 17:45:23 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 28 17:45:23 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 28 17:45:23 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Nov 28 17:45:23 compute-0 nova_compute[187223]: 2025-11-28 17:45:23.875 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:45:24 compute-0 nova_compute[187223]: 2025-11-28 17:45:24.228 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:45:25 compute-0 nova_compute[187223]: 2025-11-28 17:45:25.447 187227 DEBUG oslo_concurrency.lockutils [None req-bf6efcff-b40d-406a-82f5-2d5e2d2332c0 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:45:25 compute-0 nova_compute[187223]: 2025-11-28 17:45:25.447 187227 DEBUG oslo_concurrency.lockutils [None req-bf6efcff-b40d-406a-82f5-2d5e2d2332c0 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:45:25 compute-0 nova_compute[187223]: 2025-11-28 17:45:25.448 187227 DEBUG oslo_concurrency.lockutils [None req-bf6efcff-b40d-406a-82f5-2d5e2d2332c0 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:45:25 compute-0 nova_compute[187223]: 2025-11-28 17:45:25.476 187227 DEBUG oslo_concurrency.lockutils [None req-bf6efcff-b40d-406a-82f5-2d5e2d2332c0 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:45:25 compute-0 nova_compute[187223]: 2025-11-28 17:45:25.476 187227 DEBUG oslo_concurrency.lockutils [None req-bf6efcff-b40d-406a-82f5-2d5e2d2332c0 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:45:25 compute-0 nova_compute[187223]: 2025-11-28 17:45:25.476 187227 DEBUG oslo_concurrency.lockutils [None req-bf6efcff-b40d-406a-82f5-2d5e2d2332c0 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:45:25 compute-0 nova_compute[187223]: 2025-11-28 17:45:25.477 187227 DEBUG nova.compute.resource_tracker [None req-bf6efcff-b40d-406a-82f5-2d5e2d2332c0 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 17:45:25 compute-0 nova_compute[187223]: 2025-11-28 17:45:25.651 187227 WARNING nova.virt.libvirt.driver [None req-bf6efcff-b40d-406a-82f5-2d5e2d2332c0 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 17:45:25 compute-0 nova_compute[187223]: 2025-11-28 17:45:25.652 187227 DEBUG nova.compute.resource_tracker [None req-bf6efcff-b40d-406a-82f5-2d5e2d2332c0 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5855MB free_disk=73.34049606323242GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 17:45:25 compute-0 nova_compute[187223]: 2025-11-28 17:45:25.652 187227 DEBUG oslo_concurrency.lockutils [None req-bf6efcff-b40d-406a-82f5-2d5e2d2332c0 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:45:25 compute-0 nova_compute[187223]: 2025-11-28 17:45:25.653 187227 DEBUG oslo_concurrency.lockutils [None req-bf6efcff-b40d-406a-82f5-2d5e2d2332c0 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:45:25 compute-0 nova_compute[187223]: 2025-11-28 17:45:25.802 187227 DEBUG nova.compute.resource_tracker [None req-bf6efcff-b40d-406a-82f5-2d5e2d2332c0 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Migration for instance 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Nov 28 17:45:25 compute-0 nova_compute[187223]: 2025-11-28 17:45:25.843 187227 DEBUG nova.compute.resource_tracker [None req-bf6efcff-b40d-406a-82f5-2d5e2d2332c0 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Nov 28 17:45:25 compute-0 nova_compute[187223]: 2025-11-28 17:45:25.881 187227 DEBUG nova.compute.resource_tracker [None req-bf6efcff-b40d-406a-82f5-2d5e2d2332c0 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Migration 151d2a5c-26e6-40ef-83d8-1b59dd26a4fd is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Nov 28 17:45:25 compute-0 nova_compute[187223]: 2025-11-28 17:45:25.882 187227 DEBUG nova.compute.resource_tracker [None req-bf6efcff-b40d-406a-82f5-2d5e2d2332c0 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 17:45:25 compute-0 nova_compute[187223]: 2025-11-28 17:45:25.882 187227 DEBUG nova.compute.resource_tracker [None req-bf6efcff-b40d-406a-82f5-2d5e2d2332c0 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 17:45:25 compute-0 nova_compute[187223]: 2025-11-28 17:45:25.922 187227 DEBUG nova.compute.provider_tree [None req-bf6efcff-b40d-406a-82f5-2d5e2d2332c0 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 17:45:25 compute-0 nova_compute[187223]: 2025-11-28 17:45:25.942 187227 DEBUG nova.scheduler.client.report [None req-bf6efcff-b40d-406a-82f5-2d5e2d2332c0 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 17:45:25 compute-0 nova_compute[187223]: 2025-11-28 17:45:25.972 187227 DEBUG nova.compute.resource_tracker [None req-bf6efcff-b40d-406a-82f5-2d5e2d2332c0 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 17:45:25 compute-0 nova_compute[187223]: 2025-11-28 17:45:25.972 187227 DEBUG oslo_concurrency.lockutils [None req-bf6efcff-b40d-406a-82f5-2d5e2d2332c0 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.320s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:45:25 compute-0 nova_compute[187223]: 2025-11-28 17:45:25.977 187227 INFO nova.compute.manager [None req-bf6efcff-b40d-406a-82f5-2d5e2d2332c0 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Nov 28 17:45:26 compute-0 nova_compute[187223]: 2025-11-28 17:45:26.069 187227 INFO nova.scheduler.client.report [None req-bf6efcff-b40d-406a-82f5-2d5e2d2332c0 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Deleted allocation for migration 151d2a5c-26e6-40ef-83d8-1b59dd26a4fd
Nov 28 17:45:26 compute-0 nova_compute[187223]: 2025-11-28 17:45:26.070 187227 DEBUG nova.virt.libvirt.driver [None req-bf6efcff-b40d-406a-82f5-2d5e2d2332c0 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Nov 28 17:45:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:45:27.695 104433 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:45:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:45:27.696 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:45:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:45:27.697 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:45:28 compute-0 nova_compute[187223]: 2025-11-28 17:45:28.878 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:45:29 compute-0 nova_compute[187223]: 2025-11-28 17:45:29.231 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:45:29 compute-0 podman[197556]: time="2025-11-28T17:45:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:45:29 compute-0 podman[197556]: @ - - [28/Nov/2025:17:45:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 28 17:45:29 compute-0 podman[197556]: @ - - [28/Nov/2025:17:45:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2595 "" "Go-http-client/1.1"
Nov 28 17:45:31 compute-0 openstack_network_exporter[199717]: ERROR   17:45:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:45:31 compute-0 openstack_network_exporter[199717]: ERROR   17:45:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:45:31 compute-0 openstack_network_exporter[199717]: ERROR   17:45:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:45:31 compute-0 openstack_network_exporter[199717]: ERROR   17:45:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:45:31 compute-0 openstack_network_exporter[199717]: ERROR   17:45:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:45:33 compute-0 nova_compute[187223]: 2025-11-28 17:45:33.417 187227 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764351918.415545, 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:45:33 compute-0 nova_compute[187223]: 2025-11-28 17:45:33.417 187227 INFO nova.compute.manager [-] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] VM Stopped (Lifecycle Event)
Nov 28 17:45:33 compute-0 nova_compute[187223]: 2025-11-28 17:45:33.445 187227 DEBUG nova.compute.manager [None req-a25fe820-9b2e-435a-8b05-f93bf9692457 - - - - - -] [instance: 7b0f3d6d-b1cd-41fb-b10c-09ba82a0e68e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:45:33 compute-0 nova_compute[187223]: 2025-11-28 17:45:33.879 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:45:34 compute-0 nova_compute[187223]: 2025-11-28 17:45:34.234 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:45:36 compute-0 nova_compute[187223]: 2025-11-28 17:45:36.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:45:36 compute-0 nova_compute[187223]: 2025-11-28 17:45:36.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:45:36 compute-0 nova_compute[187223]: 2025-11-28 17:45:36.684 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 17:45:37 compute-0 podman[213929]: 2025-11-28 17:45:37.205065527 +0000 UTC m=+0.061865818 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 17:45:38 compute-0 nova_compute[187223]: 2025-11-28 17:45:38.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:45:38 compute-0 nova_compute[187223]: 2025-11-28 17:45:38.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:45:38 compute-0 nova_compute[187223]: 2025-11-28 17:45:38.703 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:45:38 compute-0 nova_compute[187223]: 2025-11-28 17:45:38.704 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:45:38 compute-0 nova_compute[187223]: 2025-11-28 17:45:38.704 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:45:38 compute-0 nova_compute[187223]: 2025-11-28 17:45:38.704 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 17:45:38 compute-0 nova_compute[187223]: 2025-11-28 17:45:38.847 187227 WARNING nova.virt.libvirt.driver [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 17:45:38 compute-0 nova_compute[187223]: 2025-11-28 17:45:38.849 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5872MB free_disk=73.34049606323242GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 17:45:38 compute-0 nova_compute[187223]: 2025-11-28 17:45:38.850 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:45:38 compute-0 nova_compute[187223]: 2025-11-28 17:45:38.850 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:45:38 compute-0 nova_compute[187223]: 2025-11-28 17:45:38.882 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:45:38 compute-0 nova_compute[187223]: 2025-11-28 17:45:38.908 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 17:45:38 compute-0 nova_compute[187223]: 2025-11-28 17:45:38.908 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 17:45:38 compute-0 nova_compute[187223]: 2025-11-28 17:45:38.929 187227 DEBUG nova.compute.provider_tree [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 17:45:38 compute-0 nova_compute[187223]: 2025-11-28 17:45:38.946 187227 DEBUG nova.scheduler.client.report [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 17:45:38 compute-0 nova_compute[187223]: 2025-11-28 17:45:38.947 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 17:45:38 compute-0 nova_compute[187223]: 2025-11-28 17:45:38.948 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.098s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:45:39 compute-0 nova_compute[187223]: 2025-11-28 17:45:39.236 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:45:39 compute-0 nova_compute[187223]: 2025-11-28 17:45:39.948 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:45:39 compute-0 nova_compute[187223]: 2025-11-28 17:45:39.949 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 17:45:39 compute-0 nova_compute[187223]: 2025-11-28 17:45:39.949 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 17:45:39 compute-0 nova_compute[187223]: 2025-11-28 17:45:39.966 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 17:45:39 compute-0 nova_compute[187223]: 2025-11-28 17:45:39.967 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:45:43 compute-0 podman[213953]: 2025-11-28 17:45:43.210466864 +0000 UTC m=+0.066079772 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 17:45:43 compute-0 nova_compute[187223]: 2025-11-28 17:45:43.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:45:43 compute-0 nova_compute[187223]: 2025-11-28 17:45:43.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:45:43 compute-0 nova_compute[187223]: 2025-11-28 17:45:43.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:45:43 compute-0 nova_compute[187223]: 2025-11-28 17:45:43.883 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:45:44 compute-0 nova_compute[187223]: 2025-11-28 17:45:44.238 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:45:48 compute-0 nova_compute[187223]: 2025-11-28 17:45:48.886 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:45:49 compute-0 podman[213975]: 2025-11-28 17:45:49.219370818 +0000 UTC m=+0.064808660 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd)
Nov 28 17:45:49 compute-0 nova_compute[187223]: 2025-11-28 17:45:49.239 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:45:49 compute-0 podman[213976]: 2025-11-28 17:45:49.249590734 +0000 UTC m=+0.091679849 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 28 17:45:52 compute-0 podman[214020]: 2025-11-28 17:45:52.249591618 +0000 UTC m=+0.094516711 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, version=9.6, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, release=1755695350, vcs-type=git, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, vendor=Red Hat, Inc.)
Nov 28 17:45:53 compute-0 nova_compute[187223]: 2025-11-28 17:45:53.888 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:45:54 compute-0 nova_compute[187223]: 2025-11-28 17:45:54.278 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:45:58 compute-0 nova_compute[187223]: 2025-11-28 17:45:58.892 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:45:59 compute-0 nova_compute[187223]: 2025-11-28 17:45:59.280 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:45:59 compute-0 podman[197556]: time="2025-11-28T17:45:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:45:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:45:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 28 17:45:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:45:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2592 "" "Go-http-client/1.1"
Nov 28 17:46:01 compute-0 openstack_network_exporter[199717]: ERROR   17:46:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:46:01 compute-0 openstack_network_exporter[199717]: ERROR   17:46:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:46:01 compute-0 openstack_network_exporter[199717]: ERROR   17:46:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:46:01 compute-0 openstack_network_exporter[199717]: ERROR   17:46:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:46:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:46:01 compute-0 openstack_network_exporter[199717]: ERROR   17:46:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:46:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:46:03 compute-0 nova_compute[187223]: 2025-11-28 17:46:03.893 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:46:04 compute-0 nova_compute[187223]: 2025-11-28 17:46:04.283 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:46:05 compute-0 ovn_controller[95574]: 2025-11-28T17:46:05Z|00122|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Nov 28 17:46:08 compute-0 podman[214042]: 2025-11-28 17:46:08.192152613 +0000 UTC m=+0.055738606 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 17:46:08 compute-0 nova_compute[187223]: 2025-11-28 17:46:08.207 187227 DEBUG nova.compute.manager [None req-57fbf0c9-2001-4639-bb52-224dbc7e2335 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Removing trait COMPUTE_STATUS_DISABLED from compute node resource provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 in placement. update_compute_provider_status /usr/lib/python3.9/site-packages/nova/compute/manager.py:606
Nov 28 17:46:08 compute-0 nova_compute[187223]: 2025-11-28 17:46:08.433 187227 DEBUG nova.compute.provider_tree [None req-57fbf0c9-2001-4639-bb52-224dbc7e2335 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Updating resource provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 generation from 18 to 21 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Nov 28 17:46:08 compute-0 nova_compute[187223]: 2025-11-28 17:46:08.896 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:46:09 compute-0 nova_compute[187223]: 2025-11-28 17:46:09.285 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:46:12 compute-0 nova_compute[187223]: 2025-11-28 17:46:12.251 187227 DEBUG oslo_concurrency.lockutils [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Acquiring lock "3a554ced-b6a5-4a0f-b573-c1e3c6cf8382" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:46:12 compute-0 nova_compute[187223]: 2025-11-28 17:46:12.252 187227 DEBUG oslo_concurrency.lockutils [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "3a554ced-b6a5-4a0f-b573-c1e3c6cf8382" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:46:12 compute-0 nova_compute[187223]: 2025-11-28 17:46:12.269 187227 DEBUG nova.compute.manager [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 28 17:46:12 compute-0 nova_compute[187223]: 2025-11-28 17:46:12.369 187227 DEBUG oslo_concurrency.lockutils [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:46:12 compute-0 nova_compute[187223]: 2025-11-28 17:46:12.369 187227 DEBUG oslo_concurrency.lockutils [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:46:12 compute-0 nova_compute[187223]: 2025-11-28 17:46:12.376 187227 DEBUG nova.virt.hardware [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 28 17:46:12 compute-0 nova_compute[187223]: 2025-11-28 17:46:12.376 187227 INFO nova.compute.claims [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] Claim successful on node compute-0.ctlplane.example.com
Nov 28 17:46:12 compute-0 nova_compute[187223]: 2025-11-28 17:46:12.492 187227 DEBUG nova.compute.provider_tree [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 17:46:12 compute-0 nova_compute[187223]: 2025-11-28 17:46:12.506 187227 DEBUG nova.scheduler.client.report [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 17:46:12 compute-0 nova_compute[187223]: 2025-11-28 17:46:12.528 187227 DEBUG oslo_concurrency.lockutils [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.158s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:46:12 compute-0 nova_compute[187223]: 2025-11-28 17:46:12.529 187227 DEBUG nova.compute.manager [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 28 17:46:12 compute-0 nova_compute[187223]: 2025-11-28 17:46:12.572 187227 DEBUG nova.compute.manager [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 28 17:46:12 compute-0 nova_compute[187223]: 2025-11-28 17:46:12.572 187227 DEBUG nova.network.neutron [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 28 17:46:12 compute-0 nova_compute[187223]: 2025-11-28 17:46:12.592 187227 INFO nova.virt.libvirt.driver [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 28 17:46:12 compute-0 nova_compute[187223]: 2025-11-28 17:46:12.612 187227 DEBUG nova.compute.manager [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 28 17:46:12 compute-0 nova_compute[187223]: 2025-11-28 17:46:12.713 187227 DEBUG nova.compute.manager [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 28 17:46:12 compute-0 nova_compute[187223]: 2025-11-28 17:46:12.714 187227 DEBUG nova.virt.libvirt.driver [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 28 17:46:12 compute-0 nova_compute[187223]: 2025-11-28 17:46:12.715 187227 INFO nova.virt.libvirt.driver [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] Creating image(s)
Nov 28 17:46:12 compute-0 nova_compute[187223]: 2025-11-28 17:46:12.715 187227 DEBUG oslo_concurrency.lockutils [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Acquiring lock "/var/lib/nova/instances/3a554ced-b6a5-4a0f-b573-c1e3c6cf8382/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:46:12 compute-0 nova_compute[187223]: 2025-11-28 17:46:12.716 187227 DEBUG oslo_concurrency.lockutils [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "/var/lib/nova/instances/3a554ced-b6a5-4a0f-b573-c1e3c6cf8382/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:46:12 compute-0 nova_compute[187223]: 2025-11-28 17:46:12.716 187227 DEBUG oslo_concurrency.lockutils [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "/var/lib/nova/instances/3a554ced-b6a5-4a0f-b573-c1e3c6cf8382/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:46:12 compute-0 nova_compute[187223]: 2025-11-28 17:46:12.730 187227 DEBUG oslo_concurrency.processutils [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:46:12 compute-0 nova_compute[187223]: 2025-11-28 17:46:12.827 187227 DEBUG oslo_concurrency.processutils [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:46:12 compute-0 nova_compute[187223]: 2025-11-28 17:46:12.829 187227 DEBUG oslo_concurrency.lockutils [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Acquiring lock "bd6565e0cc32420aac21dd3de8842bc53bb13eba" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:46:12 compute-0 nova_compute[187223]: 2025-11-28 17:46:12.830 187227 DEBUG oslo_concurrency.lockutils [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "bd6565e0cc32420aac21dd3de8842bc53bb13eba" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:46:12 compute-0 nova_compute[187223]: 2025-11-28 17:46:12.856 187227 DEBUG oslo_concurrency.processutils [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:46:12 compute-0 nova_compute[187223]: 2025-11-28 17:46:12.936 187227 DEBUG oslo_concurrency.processutils [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:46:12 compute-0 nova_compute[187223]: 2025-11-28 17:46:12.937 187227 DEBUG oslo_concurrency.processutils [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba,backing_fmt=raw /var/lib/nova/instances/3a554ced-b6a5-4a0f-b573-c1e3c6cf8382/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:46:12 compute-0 nova_compute[187223]: 2025-11-28 17:46:12.976 187227 DEBUG oslo_concurrency.processutils [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba,backing_fmt=raw /var/lib/nova/instances/3a554ced-b6a5-4a0f-b573-c1e3c6cf8382/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:46:12 compute-0 nova_compute[187223]: 2025-11-28 17:46:12.977 187227 DEBUG oslo_concurrency.lockutils [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "bd6565e0cc32420aac21dd3de8842bc53bb13eba" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.147s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:46:12 compute-0 nova_compute[187223]: 2025-11-28 17:46:12.977 187227 DEBUG oslo_concurrency.processutils [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:46:13 compute-0 nova_compute[187223]: 2025-11-28 17:46:13.036 187227 DEBUG oslo_concurrency.processutils [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:46:13 compute-0 nova_compute[187223]: 2025-11-28 17:46:13.037 187227 DEBUG nova.virt.disk.api [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Checking if we can resize image /var/lib/nova/instances/3a554ced-b6a5-4a0f-b573-c1e3c6cf8382/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 28 17:46:13 compute-0 nova_compute[187223]: 2025-11-28 17:46:13.038 187227 DEBUG oslo_concurrency.processutils [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3a554ced-b6a5-4a0f-b573-c1e3c6cf8382/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:46:13 compute-0 nova_compute[187223]: 2025-11-28 17:46:13.101 187227 DEBUG oslo_concurrency.processutils [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3a554ced-b6a5-4a0f-b573-c1e3c6cf8382/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:46:13 compute-0 nova_compute[187223]: 2025-11-28 17:46:13.102 187227 DEBUG nova.virt.disk.api [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Cannot resize image /var/lib/nova/instances/3a554ced-b6a5-4a0f-b573-c1e3c6cf8382/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 28 17:46:13 compute-0 nova_compute[187223]: 2025-11-28 17:46:13.102 187227 DEBUG nova.objects.instance [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lazy-loading 'migration_context' on Instance uuid 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:46:13 compute-0 nova_compute[187223]: 2025-11-28 17:46:13.134 187227 DEBUG nova.virt.libvirt.driver [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 28 17:46:13 compute-0 nova_compute[187223]: 2025-11-28 17:46:13.135 187227 DEBUG nova.virt.libvirt.driver [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] Ensure instance console log exists: /var/lib/nova/instances/3a554ced-b6a5-4a0f-b573-c1e3c6cf8382/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 28 17:46:13 compute-0 nova_compute[187223]: 2025-11-28 17:46:13.135 187227 DEBUG oslo_concurrency.lockutils [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:46:13 compute-0 nova_compute[187223]: 2025-11-28 17:46:13.135 187227 DEBUG oslo_concurrency.lockutils [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:46:13 compute-0 nova_compute[187223]: 2025-11-28 17:46:13.136 187227 DEBUG oslo_concurrency.lockutils [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:46:13 compute-0 nova_compute[187223]: 2025-11-28 17:46:13.899 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:46:14 compute-0 podman[214082]: 2025-11-28 17:46:14.203845231 +0000 UTC m=+0.069502666 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 28 17:46:14 compute-0 nova_compute[187223]: 2025-11-28 17:46:14.288 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:46:14 compute-0 nova_compute[187223]: 2025-11-28 17:46:14.436 187227 DEBUG nova.policy [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '40bca16232f3471c8094a414f8874e9a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f987f40adf1f46018ab0ca81b8d954f6', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 28 17:46:16 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:46:16.202 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3e:3c:af', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '0e:e0:60:f9:5f:25'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 17:46:16 compute-0 nova_compute[187223]: 2025-11-28 17:46:16.203 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:46:16 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:46:16.204 104433 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 17:46:16 compute-0 nova_compute[187223]: 2025-11-28 17:46:16.305 187227 DEBUG nova.network.neutron [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] Successfully created port: d5f25ad5-b616-4614-9ff6-76e2a057dc48 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 28 17:46:17 compute-0 nova_compute[187223]: 2025-11-28 17:46:17.031 187227 DEBUG nova.network.neutron [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] Successfully updated port: d5f25ad5-b616-4614-9ff6-76e2a057dc48 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 28 17:46:17 compute-0 nova_compute[187223]: 2025-11-28 17:46:17.066 187227 DEBUG oslo_concurrency.lockutils [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Acquiring lock "refresh_cache-3a554ced-b6a5-4a0f-b573-c1e3c6cf8382" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 17:46:17 compute-0 nova_compute[187223]: 2025-11-28 17:46:17.067 187227 DEBUG oslo_concurrency.lockutils [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Acquired lock "refresh_cache-3a554ced-b6a5-4a0f-b573-c1e3c6cf8382" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 17:46:17 compute-0 nova_compute[187223]: 2025-11-28 17:46:17.067 187227 DEBUG nova.network.neutron [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 28 17:46:17 compute-0 nova_compute[187223]: 2025-11-28 17:46:17.216 187227 DEBUG nova.compute.manager [req-f39c66df-5730-4984-8eec-0991599faf2f req-ec34f486-39b4-4d79-8d08-0b9401136dca 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] Received event network-changed-d5f25ad5-b616-4614-9ff6-76e2a057dc48 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:46:17 compute-0 nova_compute[187223]: 2025-11-28 17:46:17.216 187227 DEBUG nova.compute.manager [req-f39c66df-5730-4984-8eec-0991599faf2f req-ec34f486-39b4-4d79-8d08-0b9401136dca 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] Refreshing instance network info cache due to event network-changed-d5f25ad5-b616-4614-9ff6-76e2a057dc48. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 28 17:46:17 compute-0 nova_compute[187223]: 2025-11-28 17:46:17.216 187227 DEBUG oslo_concurrency.lockutils [req-f39c66df-5730-4984-8eec-0991599faf2f req-ec34f486-39b4-4d79-8d08-0b9401136dca 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "refresh_cache-3a554ced-b6a5-4a0f-b573-c1e3c6cf8382" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 17:46:17 compute-0 nova_compute[187223]: 2025-11-28 17:46:17.310 187227 DEBUG nova.network.neutron [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 28 17:46:18 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:46:18.206 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2ad2dbac-a967-40fb-b69b-7c374c5f8e9d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:46:18 compute-0 nova_compute[187223]: 2025-11-28 17:46:18.923 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:46:19 compute-0 nova_compute[187223]: 2025-11-28 17:46:19.290 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:46:19 compute-0 nova_compute[187223]: 2025-11-28 17:46:19.575 187227 DEBUG nova.network.neutron [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] Updating instance_info_cache with network_info: [{"id": "d5f25ad5-b616-4614-9ff6-76e2a057dc48", "address": "fa:16:3e:11:ea:77", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd5f25ad5-b6", "ovs_interfaceid": "d5f25ad5-b616-4614-9ff6-76e2a057dc48", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:46:19 compute-0 nova_compute[187223]: 2025-11-28 17:46:19.597 187227 DEBUG oslo_concurrency.lockutils [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Releasing lock "refresh_cache-3a554ced-b6a5-4a0f-b573-c1e3c6cf8382" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 17:46:19 compute-0 nova_compute[187223]: 2025-11-28 17:46:19.598 187227 DEBUG nova.compute.manager [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] Instance network_info: |[{"id": "d5f25ad5-b616-4614-9ff6-76e2a057dc48", "address": "fa:16:3e:11:ea:77", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd5f25ad5-b6", "ovs_interfaceid": "d5f25ad5-b616-4614-9ff6-76e2a057dc48", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 28 17:46:19 compute-0 nova_compute[187223]: 2025-11-28 17:46:19.600 187227 DEBUG oslo_concurrency.lockutils [req-f39c66df-5730-4984-8eec-0991599faf2f req-ec34f486-39b4-4d79-8d08-0b9401136dca 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquired lock "refresh_cache-3a554ced-b6a5-4a0f-b573-c1e3c6cf8382" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 17:46:19 compute-0 nova_compute[187223]: 2025-11-28 17:46:19.600 187227 DEBUG nova.network.neutron [req-f39c66df-5730-4984-8eec-0991599faf2f req-ec34f486-39b4-4d79-8d08-0b9401136dca 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] Refreshing network info cache for port d5f25ad5-b616-4614-9ff6-76e2a057dc48 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 28 17:46:19 compute-0 nova_compute[187223]: 2025-11-28 17:46:19.606 187227 DEBUG nova.virt.libvirt.driver [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] Start _get_guest_xml network_info=[{"id": "d5f25ad5-b616-4614-9ff6-76e2a057dc48", "address": "fa:16:3e:11:ea:77", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd5f25ad5-b6", "ovs_interfaceid": "d5f25ad5-b616-4614-9ff6-76e2a057dc48", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T17:27:07Z,direct_url=<?>,disk_format='qcow2',id=e66bcfff-a835-4b6a-9892-490d158c356a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ca3b936304c947f98c618f4bf25dee0e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-28T17:27:08Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_format': None, 'device_name': '/dev/vda', 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e66bcfff-a835-4b6a-9892-490d158c356a'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 28 17:46:19 compute-0 nova_compute[187223]: 2025-11-28 17:46:19.614 187227 WARNING nova.virt.libvirt.driver [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 17:46:19 compute-0 nova_compute[187223]: 2025-11-28 17:46:19.619 187227 DEBUG nova.virt.libvirt.host [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 28 17:46:19 compute-0 nova_compute[187223]: 2025-11-28 17:46:19.619 187227 DEBUG nova.virt.libvirt.host [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 28 17:46:19 compute-0 nova_compute[187223]: 2025-11-28 17:46:19.636 187227 DEBUG nova.virt.libvirt.host [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 28 17:46:19 compute-0 nova_compute[187223]: 2025-11-28 17:46:19.638 187227 DEBUG nova.virt.libvirt.host [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 28 17:46:19 compute-0 nova_compute[187223]: 2025-11-28 17:46:19.639 187227 DEBUG nova.virt.libvirt.driver [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 28 17:46:19 compute-0 nova_compute[187223]: 2025-11-28 17:46:19.639 187227 DEBUG nova.virt.hardware [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-28T17:27:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6f44bded-bdbe-4623-9c87-afc5919e8381',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T17:27:07Z,direct_url=<?>,disk_format='qcow2',id=e66bcfff-a835-4b6a-9892-490d158c356a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ca3b936304c947f98c618f4bf25dee0e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-28T17:27:08Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 28 17:46:19 compute-0 nova_compute[187223]: 2025-11-28 17:46:19.640 187227 DEBUG nova.virt.hardware [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 28 17:46:19 compute-0 nova_compute[187223]: 2025-11-28 17:46:19.640 187227 DEBUG nova.virt.hardware [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 28 17:46:19 compute-0 nova_compute[187223]: 2025-11-28 17:46:19.641 187227 DEBUG nova.virt.hardware [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 28 17:46:19 compute-0 nova_compute[187223]: 2025-11-28 17:46:19.641 187227 DEBUG nova.virt.hardware [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 28 17:46:19 compute-0 nova_compute[187223]: 2025-11-28 17:46:19.641 187227 DEBUG nova.virt.hardware [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 28 17:46:19 compute-0 nova_compute[187223]: 2025-11-28 17:46:19.641 187227 DEBUG nova.virt.hardware [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 28 17:46:19 compute-0 nova_compute[187223]: 2025-11-28 17:46:19.642 187227 DEBUG nova.virt.hardware [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 28 17:46:19 compute-0 nova_compute[187223]: 2025-11-28 17:46:19.642 187227 DEBUG nova.virt.hardware [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 28 17:46:19 compute-0 nova_compute[187223]: 2025-11-28 17:46:19.642 187227 DEBUG nova.virt.hardware [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 28 17:46:19 compute-0 nova_compute[187223]: 2025-11-28 17:46:19.643 187227 DEBUG nova.virt.hardware [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 28 17:46:19 compute-0 nova_compute[187223]: 2025-11-28 17:46:19.647 187227 DEBUG nova.virt.libvirt.vif [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T17:46:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1838714708',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1838714708',id=15,image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f987f40adf1f46018ab0ca81b8d954f6',ramdisk_id='',reservation_id='r-qlm8oug8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-384316604',owner_user_name='tempest-TestExecuteStrategies-384316604-project-member'},tags=TagList,task_s
tate='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T17:46:12Z,user_data=None,user_id='40bca16232f3471c8094a414f8874e9a',uuid=3a554ced-b6a5-4a0f-b573-c1e3c6cf8382,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d5f25ad5-b616-4614-9ff6-76e2a057dc48", "address": "fa:16:3e:11:ea:77", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd5f25ad5-b6", "ovs_interfaceid": "d5f25ad5-b616-4614-9ff6-76e2a057dc48", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 28 17:46:19 compute-0 nova_compute[187223]: 2025-11-28 17:46:19.648 187227 DEBUG nova.network.os_vif_util [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Converting VIF {"id": "d5f25ad5-b616-4614-9ff6-76e2a057dc48", "address": "fa:16:3e:11:ea:77", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd5f25ad5-b6", "ovs_interfaceid": "d5f25ad5-b616-4614-9ff6-76e2a057dc48", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 17:46:19 compute-0 nova_compute[187223]: 2025-11-28 17:46:19.648 187227 DEBUG nova.network.os_vif_util [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:11:ea:77,bridge_name='br-int',has_traffic_filtering=True,id=d5f25ad5-b616-4614-9ff6-76e2a057dc48,network=Network(7710a7d0-31b3-4473-89c4-40533fdd6e7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd5f25ad5-b6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 17:46:19 compute-0 nova_compute[187223]: 2025-11-28 17:46:19.649 187227 DEBUG nova.objects.instance [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lazy-loading 'pci_devices' on Instance uuid 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:46:19 compute-0 nova_compute[187223]: 2025-11-28 17:46:19.664 187227 DEBUG nova.virt.libvirt.driver [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] End _get_guest_xml xml=<domain type="kvm">
Nov 28 17:46:19 compute-0 nova_compute[187223]:   <uuid>3a554ced-b6a5-4a0f-b573-c1e3c6cf8382</uuid>
Nov 28 17:46:19 compute-0 nova_compute[187223]:   <name>instance-0000000f</name>
Nov 28 17:46:19 compute-0 nova_compute[187223]:   <memory>131072</memory>
Nov 28 17:46:19 compute-0 nova_compute[187223]:   <vcpu>1</vcpu>
Nov 28 17:46:19 compute-0 nova_compute[187223]:   <metadata>
Nov 28 17:46:19 compute-0 nova_compute[187223]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 28 17:46:19 compute-0 nova_compute[187223]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 28 17:46:19 compute-0 nova_compute[187223]:       <nova:name>tempest-TestExecuteStrategies-server-1838714708</nova:name>
Nov 28 17:46:19 compute-0 nova_compute[187223]:       <nova:creationTime>2025-11-28 17:46:19</nova:creationTime>
Nov 28 17:46:19 compute-0 nova_compute[187223]:       <nova:flavor name="m1.nano">
Nov 28 17:46:19 compute-0 nova_compute[187223]:         <nova:memory>128</nova:memory>
Nov 28 17:46:19 compute-0 nova_compute[187223]:         <nova:disk>1</nova:disk>
Nov 28 17:46:19 compute-0 nova_compute[187223]:         <nova:swap>0</nova:swap>
Nov 28 17:46:19 compute-0 nova_compute[187223]:         <nova:ephemeral>0</nova:ephemeral>
Nov 28 17:46:19 compute-0 nova_compute[187223]:         <nova:vcpus>1</nova:vcpus>
Nov 28 17:46:19 compute-0 nova_compute[187223]:       </nova:flavor>
Nov 28 17:46:19 compute-0 nova_compute[187223]:       <nova:owner>
Nov 28 17:46:19 compute-0 nova_compute[187223]:         <nova:user uuid="40bca16232f3471c8094a414f8874e9a">tempest-TestExecuteStrategies-384316604-project-member</nova:user>
Nov 28 17:46:19 compute-0 nova_compute[187223]:         <nova:project uuid="f987f40adf1f46018ab0ca81b8d954f6">tempest-TestExecuteStrategies-384316604</nova:project>
Nov 28 17:46:19 compute-0 nova_compute[187223]:       </nova:owner>
Nov 28 17:46:19 compute-0 nova_compute[187223]:       <nova:root type="image" uuid="e66bcfff-a835-4b6a-9892-490d158c356a"/>
Nov 28 17:46:19 compute-0 nova_compute[187223]:       <nova:ports>
Nov 28 17:46:19 compute-0 nova_compute[187223]:         <nova:port uuid="d5f25ad5-b616-4614-9ff6-76e2a057dc48">
Nov 28 17:46:19 compute-0 nova_compute[187223]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 28 17:46:19 compute-0 nova_compute[187223]:         </nova:port>
Nov 28 17:46:19 compute-0 nova_compute[187223]:       </nova:ports>
Nov 28 17:46:19 compute-0 nova_compute[187223]:     </nova:instance>
Nov 28 17:46:19 compute-0 nova_compute[187223]:   </metadata>
Nov 28 17:46:19 compute-0 nova_compute[187223]:   <sysinfo type="smbios">
Nov 28 17:46:19 compute-0 nova_compute[187223]:     <system>
Nov 28 17:46:19 compute-0 nova_compute[187223]:       <entry name="manufacturer">RDO</entry>
Nov 28 17:46:19 compute-0 nova_compute[187223]:       <entry name="product">OpenStack Compute</entry>
Nov 28 17:46:19 compute-0 nova_compute[187223]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 28 17:46:19 compute-0 nova_compute[187223]:       <entry name="serial">3a554ced-b6a5-4a0f-b573-c1e3c6cf8382</entry>
Nov 28 17:46:19 compute-0 nova_compute[187223]:       <entry name="uuid">3a554ced-b6a5-4a0f-b573-c1e3c6cf8382</entry>
Nov 28 17:46:19 compute-0 nova_compute[187223]:       <entry name="family">Virtual Machine</entry>
Nov 28 17:46:19 compute-0 nova_compute[187223]:     </system>
Nov 28 17:46:19 compute-0 nova_compute[187223]:   </sysinfo>
Nov 28 17:46:19 compute-0 nova_compute[187223]:   <os>
Nov 28 17:46:19 compute-0 nova_compute[187223]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 28 17:46:19 compute-0 nova_compute[187223]:     <boot dev="hd"/>
Nov 28 17:46:19 compute-0 nova_compute[187223]:     <smbios mode="sysinfo"/>
Nov 28 17:46:19 compute-0 nova_compute[187223]:   </os>
Nov 28 17:46:19 compute-0 nova_compute[187223]:   <features>
Nov 28 17:46:19 compute-0 nova_compute[187223]:     <acpi/>
Nov 28 17:46:19 compute-0 nova_compute[187223]:     <apic/>
Nov 28 17:46:19 compute-0 nova_compute[187223]:     <vmcoreinfo/>
Nov 28 17:46:19 compute-0 nova_compute[187223]:   </features>
Nov 28 17:46:19 compute-0 nova_compute[187223]:   <clock offset="utc">
Nov 28 17:46:19 compute-0 nova_compute[187223]:     <timer name="pit" tickpolicy="delay"/>
Nov 28 17:46:19 compute-0 nova_compute[187223]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 28 17:46:19 compute-0 nova_compute[187223]:     <timer name="hpet" present="no"/>
Nov 28 17:46:19 compute-0 nova_compute[187223]:   </clock>
Nov 28 17:46:19 compute-0 nova_compute[187223]:   <cpu mode="custom" match="exact">
Nov 28 17:46:19 compute-0 nova_compute[187223]:     <model>Nehalem</model>
Nov 28 17:46:19 compute-0 nova_compute[187223]:     <topology sockets="1" cores="1" threads="1"/>
Nov 28 17:46:19 compute-0 nova_compute[187223]:   </cpu>
Nov 28 17:46:19 compute-0 nova_compute[187223]:   <devices>
Nov 28 17:46:19 compute-0 nova_compute[187223]:     <disk type="file" device="disk">
Nov 28 17:46:19 compute-0 nova_compute[187223]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 28 17:46:19 compute-0 nova_compute[187223]:       <source file="/var/lib/nova/instances/3a554ced-b6a5-4a0f-b573-c1e3c6cf8382/disk"/>
Nov 28 17:46:19 compute-0 nova_compute[187223]:       <target dev="vda" bus="virtio"/>
Nov 28 17:46:19 compute-0 nova_compute[187223]:     </disk>
Nov 28 17:46:19 compute-0 nova_compute[187223]:     <disk type="file" device="cdrom">
Nov 28 17:46:19 compute-0 nova_compute[187223]:       <driver name="qemu" type="raw" cache="none"/>
Nov 28 17:46:19 compute-0 nova_compute[187223]:       <source file="/var/lib/nova/instances/3a554ced-b6a5-4a0f-b573-c1e3c6cf8382/disk.config"/>
Nov 28 17:46:19 compute-0 nova_compute[187223]:       <target dev="sda" bus="sata"/>
Nov 28 17:46:19 compute-0 nova_compute[187223]:     </disk>
Nov 28 17:46:19 compute-0 nova_compute[187223]:     <interface type="ethernet">
Nov 28 17:46:19 compute-0 nova_compute[187223]:       <mac address="fa:16:3e:11:ea:77"/>
Nov 28 17:46:19 compute-0 nova_compute[187223]:       <model type="virtio"/>
Nov 28 17:46:19 compute-0 nova_compute[187223]:       <driver name="vhost" rx_queue_size="512"/>
Nov 28 17:46:19 compute-0 nova_compute[187223]:       <mtu size="1442"/>
Nov 28 17:46:19 compute-0 nova_compute[187223]:       <target dev="tapd5f25ad5-b6"/>
Nov 28 17:46:19 compute-0 nova_compute[187223]:     </interface>
Nov 28 17:46:19 compute-0 nova_compute[187223]:     <serial type="pty">
Nov 28 17:46:19 compute-0 nova_compute[187223]:       <log file="/var/lib/nova/instances/3a554ced-b6a5-4a0f-b573-c1e3c6cf8382/console.log" append="off"/>
Nov 28 17:46:19 compute-0 nova_compute[187223]:     </serial>
Nov 28 17:46:19 compute-0 nova_compute[187223]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 28 17:46:19 compute-0 nova_compute[187223]:     <video>
Nov 28 17:46:19 compute-0 nova_compute[187223]:       <model type="virtio"/>
Nov 28 17:46:19 compute-0 nova_compute[187223]:     </video>
Nov 28 17:46:19 compute-0 nova_compute[187223]:     <input type="tablet" bus="usb"/>
Nov 28 17:46:19 compute-0 nova_compute[187223]:     <rng model="virtio">
Nov 28 17:46:19 compute-0 nova_compute[187223]:       <backend model="random">/dev/urandom</backend>
Nov 28 17:46:19 compute-0 nova_compute[187223]:     </rng>
Nov 28 17:46:19 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root"/>
Nov 28 17:46:19 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:46:19 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:46:19 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:46:19 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:46:19 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:46:19 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:46:19 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:46:19 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:46:19 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:46:19 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:46:19 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:46:19 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:46:19 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:46:19 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:46:19 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:46:19 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:46:19 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:46:19 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:46:19 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:46:19 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:46:19 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:46:19 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:46:19 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:46:19 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:46:19 compute-0 nova_compute[187223]:     <controller type="usb" index="0"/>
Nov 28 17:46:19 compute-0 nova_compute[187223]:     <memballoon model="virtio">
Nov 28 17:46:19 compute-0 nova_compute[187223]:       <stats period="10"/>
Nov 28 17:46:19 compute-0 nova_compute[187223]:     </memballoon>
Nov 28 17:46:19 compute-0 nova_compute[187223]:   </devices>
Nov 28 17:46:19 compute-0 nova_compute[187223]: </domain>
Nov 28 17:46:19 compute-0 nova_compute[187223]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 28 17:46:19 compute-0 nova_compute[187223]: 2025-11-28 17:46:19.665 187227 DEBUG nova.compute.manager [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] Preparing to wait for external event network-vif-plugged-d5f25ad5-b616-4614-9ff6-76e2a057dc48 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 28 17:46:19 compute-0 nova_compute[187223]: 2025-11-28 17:46:19.666 187227 DEBUG oslo_concurrency.lockutils [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Acquiring lock "3a554ced-b6a5-4a0f-b573-c1e3c6cf8382-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:46:19 compute-0 nova_compute[187223]: 2025-11-28 17:46:19.666 187227 DEBUG oslo_concurrency.lockutils [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "3a554ced-b6a5-4a0f-b573-c1e3c6cf8382-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:46:19 compute-0 nova_compute[187223]: 2025-11-28 17:46:19.666 187227 DEBUG oslo_concurrency.lockutils [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "3a554ced-b6a5-4a0f-b573-c1e3c6cf8382-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:46:19 compute-0 nova_compute[187223]: 2025-11-28 17:46:19.667 187227 DEBUG nova.virt.libvirt.vif [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T17:46:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1838714708',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1838714708',id=15,image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f987f40adf1f46018ab0ca81b8d954f6',ramdisk_id='',reservation_id='r-qlm8oug8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-384316604',owner_user_name='tempest-TestExecuteStrategies-384316604-project-member'},tags=TagL
ist,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T17:46:12Z,user_data=None,user_id='40bca16232f3471c8094a414f8874e9a',uuid=3a554ced-b6a5-4a0f-b573-c1e3c6cf8382,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d5f25ad5-b616-4614-9ff6-76e2a057dc48", "address": "fa:16:3e:11:ea:77", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd5f25ad5-b6", "ovs_interfaceid": "d5f25ad5-b616-4614-9ff6-76e2a057dc48", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 28 17:46:19 compute-0 nova_compute[187223]: 2025-11-28 17:46:19.667 187227 DEBUG nova.network.os_vif_util [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Converting VIF {"id": "d5f25ad5-b616-4614-9ff6-76e2a057dc48", "address": "fa:16:3e:11:ea:77", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd5f25ad5-b6", "ovs_interfaceid": "d5f25ad5-b616-4614-9ff6-76e2a057dc48", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 17:46:19 compute-0 nova_compute[187223]: 2025-11-28 17:46:19.668 187227 DEBUG nova.network.os_vif_util [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:11:ea:77,bridge_name='br-int',has_traffic_filtering=True,id=d5f25ad5-b616-4614-9ff6-76e2a057dc48,network=Network(7710a7d0-31b3-4473-89c4-40533fdd6e7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd5f25ad5-b6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 17:46:19 compute-0 nova_compute[187223]: 2025-11-28 17:46:19.668 187227 DEBUG os_vif [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:11:ea:77,bridge_name='br-int',has_traffic_filtering=True,id=d5f25ad5-b616-4614-9ff6-76e2a057dc48,network=Network(7710a7d0-31b3-4473-89c4-40533fdd6e7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd5f25ad5-b6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 28 17:46:19 compute-0 nova_compute[187223]: 2025-11-28 17:46:19.669 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:46:19 compute-0 nova_compute[187223]: 2025-11-28 17:46:19.669 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:46:19 compute-0 nova_compute[187223]: 2025-11-28 17:46:19.670 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 17:46:19 compute-0 nova_compute[187223]: 2025-11-28 17:46:19.673 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:46:19 compute-0 nova_compute[187223]: 2025-11-28 17:46:19.673 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd5f25ad5-b6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:46:19 compute-0 nova_compute[187223]: 2025-11-28 17:46:19.674 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd5f25ad5-b6, col_values=(('external_ids', {'iface-id': 'd5f25ad5-b616-4614-9ff6-76e2a057dc48', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:11:ea:77', 'vm-uuid': '3a554ced-b6a5-4a0f-b573-c1e3c6cf8382'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:46:19 compute-0 nova_compute[187223]: 2025-11-28 17:46:19.675 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:46:19 compute-0 NetworkManager[55763]: <info>  [1764351979.6770] manager: (tapd5f25ad5-b6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/52)
Nov 28 17:46:19 compute-0 nova_compute[187223]: 2025-11-28 17:46:19.678 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 17:46:19 compute-0 nova_compute[187223]: 2025-11-28 17:46:19.684 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:46:19 compute-0 nova_compute[187223]: 2025-11-28 17:46:19.686 187227 INFO os_vif [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:11:ea:77,bridge_name='br-int',has_traffic_filtering=True,id=d5f25ad5-b616-4614-9ff6-76e2a057dc48,network=Network(7710a7d0-31b3-4473-89c4-40533fdd6e7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd5f25ad5-b6')
Nov 28 17:46:19 compute-0 nova_compute[187223]: 2025-11-28 17:46:19.736 187227 DEBUG nova.virt.libvirt.driver [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 28 17:46:19 compute-0 nova_compute[187223]: 2025-11-28 17:46:19.737 187227 DEBUG nova.virt.libvirt.driver [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 28 17:46:19 compute-0 nova_compute[187223]: 2025-11-28 17:46:19.737 187227 DEBUG nova.virt.libvirt.driver [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] No VIF found with MAC fa:16:3e:11:ea:77, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 28 17:46:19 compute-0 nova_compute[187223]: 2025-11-28 17:46:19.737 187227 INFO nova.virt.libvirt.driver [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] Using config drive
Nov 28 17:46:20 compute-0 podman[214104]: 2025-11-28 17:46:20.207372061 +0000 UTC m=+0.070397442 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 28 17:46:20 compute-0 podman[214105]: 2025-11-28 17:46:20.27185348 +0000 UTC m=+0.118759724 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 17:46:20 compute-0 nova_compute[187223]: 2025-11-28 17:46:20.720 187227 INFO nova.virt.libvirt.driver [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] Creating config drive at /var/lib/nova/instances/3a554ced-b6a5-4a0f-b573-c1e3c6cf8382/disk.config
Nov 28 17:46:20 compute-0 nova_compute[187223]: 2025-11-28 17:46:20.725 187227 DEBUG oslo_concurrency.processutils [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3a554ced-b6a5-4a0f-b573-c1e3c6cf8382/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprqz3ecey execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:46:20 compute-0 nova_compute[187223]: 2025-11-28 17:46:20.852 187227 DEBUG oslo_concurrency.processutils [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3a554ced-b6a5-4a0f-b573-c1e3c6cf8382/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprqz3ecey" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:46:20 compute-0 kernel: tapd5f25ad5-b6: entered promiscuous mode
Nov 28 17:46:20 compute-0 NetworkManager[55763]: <info>  [1764351980.9221] manager: (tapd5f25ad5-b6): new Tun device (/org/freedesktop/NetworkManager/Devices/53)
Nov 28 17:46:20 compute-0 ovn_controller[95574]: 2025-11-28T17:46:20Z|00123|binding|INFO|Claiming lport d5f25ad5-b616-4614-9ff6-76e2a057dc48 for this chassis.
Nov 28 17:46:20 compute-0 nova_compute[187223]: 2025-11-28 17:46:20.922 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:46:20 compute-0 ovn_controller[95574]: 2025-11-28T17:46:20Z|00124|binding|INFO|d5f25ad5-b616-4614-9ff6-76e2a057dc48: Claiming fa:16:3e:11:ea:77 10.100.0.12
Nov 28 17:46:20 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:46:20.930 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:11:ea:77 10.100.0.12'], port_security=['fa:16:3e:11:ea:77 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '3a554ced-b6a5-4a0f-b573-c1e3c6cf8382', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f987f40adf1f46018ab0ca81b8d954f6', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2454196c-3f33-40c4-b65e-ff5ce51cee25', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8b91a17c-2725-4cda-baa8-a6987e8e4bba, chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], logical_port=d5f25ad5-b616-4614-9ff6-76e2a057dc48) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 17:46:20 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:46:20.932 104433 INFO neutron.agent.ovn.metadata.agent [-] Port d5f25ad5-b616-4614-9ff6-76e2a057dc48 in datapath 7710a7d0-31b3-4473-89c4-40533fdd6e7d bound to our chassis
Nov 28 17:46:20 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:46:20.933 104433 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7710a7d0-31b3-4473-89c4-40533fdd6e7d
Nov 28 17:46:20 compute-0 nova_compute[187223]: 2025-11-28 17:46:20.936 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:46:20 compute-0 ovn_controller[95574]: 2025-11-28T17:46:20Z|00125|binding|INFO|Setting lport d5f25ad5-b616-4614-9ff6-76e2a057dc48 ovn-installed in OVS
Nov 28 17:46:20 compute-0 ovn_controller[95574]: 2025-11-28T17:46:20Z|00126|binding|INFO|Setting lport d5f25ad5-b616-4614-9ff6-76e2a057dc48 up in Southbound
Nov 28 17:46:20 compute-0 nova_compute[187223]: 2025-11-28 17:46:20.939 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:46:20 compute-0 nova_compute[187223]: 2025-11-28 17:46:20.941 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:46:20 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:46:20.948 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[89e74831-d18c-4b1b-8321-2838314883ee]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:46:20 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:46:20.949 104433 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7710a7d0-31 in ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 28 17:46:20 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:46:20.952 208826 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7710a7d0-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 28 17:46:20 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:46:20.952 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[5aa9ba8e-78fa-4f80-ad41-808326456cac]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:46:20 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:46:20.954 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[9008d850-9b4a-4983-809b-ff6b6e3d2496]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:46:20 compute-0 systemd-udevd[214170]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 17:46:20 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:46:20.966 104546 DEBUG oslo.privsep.daemon [-] privsep: reply[5e8fdc37-2027-4a19-af63-dd1801f9f30f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:46:20 compute-0 systemd-machined[153517]: New machine qemu-11-instance-0000000f.
Nov 28 17:46:20 compute-0 NetworkManager[55763]: <info>  [1764351980.9723] device (tapd5f25ad5-b6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 28 17:46:20 compute-0 NetworkManager[55763]: <info>  [1764351980.9734] device (tapd5f25ad5-b6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 28 17:46:20 compute-0 systemd[1]: Started Virtual Machine qemu-11-instance-0000000f.
Nov 28 17:46:20 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:46:20.985 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[c91e6869-84c7-420b-8563-a1bae77789f7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:46:21 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:46:21.019 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[f66565cc-0329-48de-951b-b715800b1c68]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:46:21 compute-0 systemd-udevd[214175]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 17:46:21 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:46:21.025 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[fd94127a-b6b2-447f-8aca-8b57c4a6f25b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:46:21 compute-0 NetworkManager[55763]: <info>  [1764351981.0267] manager: (tap7710a7d0-30): new Veth device (/org/freedesktop/NetworkManager/Devices/54)
Nov 28 17:46:21 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:46:21.058 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[4c3688b8-058f-4e2c-a900-6e2ada77af47]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:46:21 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:46:21.062 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[063be20d-ed16-443c-9e58-877d54394850]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:46:21 compute-0 NetworkManager[55763]: <info>  [1764351981.0906] device (tap7710a7d0-30): carrier: link connected
Nov 28 17:46:21 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:46:21.098 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[583b7b81-7ca5-4699-a65d-ccc2fc30a43d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:46:21 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:46:21.116 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[57c885f8-5cfa-49cc-ac02-56235ebd214c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7710a7d0-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7e:b9:9f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 37], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 515572, 'reachable_time': 22526, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214203, 'error': None, 'target': 'ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:46:21 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:46:21.134 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[c8d25747-9c7f-48d8-9061-c4eb414507dd]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe7e:b99f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 515572, 'tstamp': 515572}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214204, 'error': None, 'target': 'ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:46:21 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:46:21.153 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[15ebe273-0fe7-4eb9-9344-5b2e1e906084]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7710a7d0-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7e:b9:9f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 37], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 515572, 'reachable_time': 22526, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214205, 'error': None, 'target': 'ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:46:21 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:46:21.194 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[25ad5ae9-6ed7-43c5-bbcc-2b5d1ee91ea2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:46:21 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:46:21.276 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[87986f82-65a8-43a5-bbd1-6268154820b9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:46:21 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:46:21.279 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7710a7d0-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:46:21 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:46:21.279 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 17:46:21 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:46:21.280 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7710a7d0-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:46:21 compute-0 nova_compute[187223]: 2025-11-28 17:46:21.283 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:46:21 compute-0 NetworkManager[55763]: <info>  [1764351981.2845] manager: (tap7710a7d0-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/55)
Nov 28 17:46:21 compute-0 kernel: tap7710a7d0-30: entered promiscuous mode
Nov 28 17:46:21 compute-0 nova_compute[187223]: 2025-11-28 17:46:21.287 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:46:21 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:46:21.289 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7710a7d0-30, col_values=(('external_ids', {'iface-id': 'bc789832-2d4b-4b14-95c2-e30a740a3a6b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:46:21 compute-0 nova_compute[187223]: 2025-11-28 17:46:21.291 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:46:21 compute-0 ovn_controller[95574]: 2025-11-28T17:46:21Z|00127|binding|INFO|Releasing lport bc789832-2d4b-4b14-95c2-e30a740a3a6b from this chassis (sb_readonly=0)
Nov 28 17:46:21 compute-0 nova_compute[187223]: 2025-11-28 17:46:21.292 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:46:21 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:46:21.294 104433 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7710a7d0-31b3-4473-89c4-40533fdd6e7d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7710a7d0-31b3-4473-89c4-40533fdd6e7d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 28 17:46:21 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:46:21.295 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[cefbeb77-7417-4fcc-967a-fc1d0936b46d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:46:21 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:46:21.296 104433 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 28 17:46:21 compute-0 ovn_metadata_agent[104428]: global
Nov 28 17:46:21 compute-0 ovn_metadata_agent[104428]:     log         /dev/log local0 debug
Nov 28 17:46:21 compute-0 ovn_metadata_agent[104428]:     log-tag     haproxy-metadata-proxy-7710a7d0-31b3-4473-89c4-40533fdd6e7d
Nov 28 17:46:21 compute-0 ovn_metadata_agent[104428]:     user        root
Nov 28 17:46:21 compute-0 ovn_metadata_agent[104428]:     group       root
Nov 28 17:46:21 compute-0 ovn_metadata_agent[104428]:     maxconn     1024
Nov 28 17:46:21 compute-0 ovn_metadata_agent[104428]:     pidfile     /var/lib/neutron/external/pids/7710a7d0-31b3-4473-89c4-40533fdd6e7d.pid.haproxy
Nov 28 17:46:21 compute-0 ovn_metadata_agent[104428]:     daemon
Nov 28 17:46:21 compute-0 ovn_metadata_agent[104428]: 
Nov 28 17:46:21 compute-0 ovn_metadata_agent[104428]: defaults
Nov 28 17:46:21 compute-0 ovn_metadata_agent[104428]:     log global
Nov 28 17:46:21 compute-0 ovn_metadata_agent[104428]:     mode http
Nov 28 17:46:21 compute-0 ovn_metadata_agent[104428]:     option httplog
Nov 28 17:46:21 compute-0 ovn_metadata_agent[104428]:     option dontlognull
Nov 28 17:46:21 compute-0 ovn_metadata_agent[104428]:     option http-server-close
Nov 28 17:46:21 compute-0 ovn_metadata_agent[104428]:     option forwardfor
Nov 28 17:46:21 compute-0 ovn_metadata_agent[104428]:     retries                 3
Nov 28 17:46:21 compute-0 ovn_metadata_agent[104428]:     timeout http-request    30s
Nov 28 17:46:21 compute-0 ovn_metadata_agent[104428]:     timeout connect         30s
Nov 28 17:46:21 compute-0 ovn_metadata_agent[104428]:     timeout client          32s
Nov 28 17:46:21 compute-0 ovn_metadata_agent[104428]:     timeout server          32s
Nov 28 17:46:21 compute-0 ovn_metadata_agent[104428]:     timeout http-keep-alive 30s
Nov 28 17:46:21 compute-0 ovn_metadata_agent[104428]: 
Nov 28 17:46:21 compute-0 ovn_metadata_agent[104428]: 
Nov 28 17:46:21 compute-0 ovn_metadata_agent[104428]: listen listener
Nov 28 17:46:21 compute-0 ovn_metadata_agent[104428]:     bind 169.254.169.254:80
Nov 28 17:46:21 compute-0 ovn_metadata_agent[104428]:     server metadata /var/lib/neutron/metadata_proxy
Nov 28 17:46:21 compute-0 ovn_metadata_agent[104428]:     http-request add-header X-OVN-Network-ID 7710a7d0-31b3-4473-89c4-40533fdd6e7d
Nov 28 17:46:21 compute-0 ovn_metadata_agent[104428]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 28 17:46:21 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:46:21.298 104433 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'env', 'PROCESS_TAG=haproxy-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7710a7d0-31b3-4473-89c4-40533fdd6e7d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 28 17:46:21 compute-0 nova_compute[187223]: 2025-11-28 17:46:21.303 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:46:21 compute-0 nova_compute[187223]: 2025-11-28 17:46:21.454 187227 DEBUG nova.virt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Emitting event <LifecycleEvent: 1764351981.4530663, 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:46:21 compute-0 nova_compute[187223]: 2025-11-28 17:46:21.454 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] VM Started (Lifecycle Event)
Nov 28 17:46:21 compute-0 nova_compute[187223]: 2025-11-28 17:46:21.479 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:46:21 compute-0 nova_compute[187223]: 2025-11-28 17:46:21.483 187227 DEBUG nova.virt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Emitting event <LifecycleEvent: 1764351981.4533868, 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:46:21 compute-0 nova_compute[187223]: 2025-11-28 17:46:21.484 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] VM Paused (Lifecycle Event)
Nov 28 17:46:21 compute-0 nova_compute[187223]: 2025-11-28 17:46:21.521 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:46:21 compute-0 nova_compute[187223]: 2025-11-28 17:46:21.526 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 28 17:46:21 compute-0 nova_compute[187223]: 2025-11-28 17:46:21.549 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 28 17:46:21 compute-0 nova_compute[187223]: 2025-11-28 17:46:21.626 187227 DEBUG nova.compute.manager [req-f65880c9-4c6a-4d9a-ae83-dc2c81e1a8ba req-8c43b077-d5ff-4142-8b1e-e78ee687d6f9 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] Received event network-vif-plugged-d5f25ad5-b616-4614-9ff6-76e2a057dc48 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:46:21 compute-0 nova_compute[187223]: 2025-11-28 17:46:21.627 187227 DEBUG oslo_concurrency.lockutils [req-f65880c9-4c6a-4d9a-ae83-dc2c81e1a8ba req-8c43b077-d5ff-4142-8b1e-e78ee687d6f9 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "3a554ced-b6a5-4a0f-b573-c1e3c6cf8382-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:46:21 compute-0 nova_compute[187223]: 2025-11-28 17:46:21.627 187227 DEBUG oslo_concurrency.lockutils [req-f65880c9-4c6a-4d9a-ae83-dc2c81e1a8ba req-8c43b077-d5ff-4142-8b1e-e78ee687d6f9 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "3a554ced-b6a5-4a0f-b573-c1e3c6cf8382-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:46:21 compute-0 nova_compute[187223]: 2025-11-28 17:46:21.627 187227 DEBUG oslo_concurrency.lockutils [req-f65880c9-4c6a-4d9a-ae83-dc2c81e1a8ba req-8c43b077-d5ff-4142-8b1e-e78ee687d6f9 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "3a554ced-b6a5-4a0f-b573-c1e3c6cf8382-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:46:21 compute-0 nova_compute[187223]: 2025-11-28 17:46:21.627 187227 DEBUG nova.compute.manager [req-f65880c9-4c6a-4d9a-ae83-dc2c81e1a8ba req-8c43b077-d5ff-4142-8b1e-e78ee687d6f9 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] Processing event network-vif-plugged-d5f25ad5-b616-4614-9ff6-76e2a057dc48 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 28 17:46:21 compute-0 nova_compute[187223]: 2025-11-28 17:46:21.628 187227 DEBUG nova.compute.manager [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 28 17:46:21 compute-0 nova_compute[187223]: 2025-11-28 17:46:21.631 187227 DEBUG nova.virt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Emitting event <LifecycleEvent: 1764351981.6307645, 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:46:21 compute-0 nova_compute[187223]: 2025-11-28 17:46:21.631 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] VM Resumed (Lifecycle Event)
Nov 28 17:46:21 compute-0 nova_compute[187223]: 2025-11-28 17:46:21.632 187227 DEBUG nova.virt.libvirt.driver [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 28 17:46:21 compute-0 nova_compute[187223]: 2025-11-28 17:46:21.635 187227 INFO nova.virt.libvirt.driver [-] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] Instance spawned successfully.
Nov 28 17:46:21 compute-0 nova_compute[187223]: 2025-11-28 17:46:21.636 187227 DEBUG nova.virt.libvirt.driver [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 28 17:46:21 compute-0 nova_compute[187223]: 2025-11-28 17:46:21.653 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:46:21 compute-0 nova_compute[187223]: 2025-11-28 17:46:21.659 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 28 17:46:21 compute-0 nova_compute[187223]: 2025-11-28 17:46:21.662 187227 DEBUG nova.virt.libvirt.driver [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:46:21 compute-0 nova_compute[187223]: 2025-11-28 17:46:21.662 187227 DEBUG nova.virt.libvirt.driver [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:46:21 compute-0 nova_compute[187223]: 2025-11-28 17:46:21.663 187227 DEBUG nova.virt.libvirt.driver [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:46:21 compute-0 nova_compute[187223]: 2025-11-28 17:46:21.663 187227 DEBUG nova.virt.libvirt.driver [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:46:21 compute-0 nova_compute[187223]: 2025-11-28 17:46:21.663 187227 DEBUG nova.virt.libvirt.driver [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:46:21 compute-0 nova_compute[187223]: 2025-11-28 17:46:21.664 187227 DEBUG nova.virt.libvirt.driver [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:46:21 compute-0 nova_compute[187223]: 2025-11-28 17:46:21.695 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 28 17:46:21 compute-0 podman[214243]: 2025-11-28 17:46:21.708675105 +0000 UTC m=+0.056920871 container create c24a28bde37c67e1e9638390dcc8fb6d8ba366cfcad98fde6d5d0ac95645b368 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 28 17:46:21 compute-0 nova_compute[187223]: 2025-11-28 17:46:21.721 187227 INFO nova.compute.manager [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] Took 9.01 seconds to spawn the instance on the hypervisor.
Nov 28 17:46:21 compute-0 nova_compute[187223]: 2025-11-28 17:46:21.721 187227 DEBUG nova.compute.manager [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:46:21 compute-0 systemd[1]: Started libpod-conmon-c24a28bde37c67e1e9638390dcc8fb6d8ba366cfcad98fde6d5d0ac95645b368.scope.
Nov 28 17:46:21 compute-0 podman[214243]: 2025-11-28 17:46:21.680163288 +0000 UTC m=+0.028409064 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 17:46:21 compute-0 nova_compute[187223]: 2025-11-28 17:46:21.781 187227 INFO nova.compute.manager [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] Took 9.44 seconds to build instance.
Nov 28 17:46:21 compute-0 systemd[1]: Started libcrun container.
Nov 28 17:46:21 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/85e098c3d50641adcbf1f9a8f6c7fdc3a6d028f86ca4c918193534da870b6f39/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 17:46:21 compute-0 podman[214243]: 2025-11-28 17:46:21.819944201 +0000 UTC m=+0.168189987 container init c24a28bde37c67e1e9638390dcc8fb6d8ba366cfcad98fde6d5d0ac95645b368 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS)
Nov 28 17:46:21 compute-0 podman[214243]: 2025-11-28 17:46:21.827817949 +0000 UTC m=+0.176063705 container start c24a28bde37c67e1e9638390dcc8fb6d8ba366cfcad98fde6d5d0ac95645b368 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Nov 28 17:46:21 compute-0 nova_compute[187223]: 2025-11-28 17:46:21.843 187227 DEBUG oslo_concurrency.lockutils [None req-8c0a9fbf-147b-4a58-8508-135648f035fd 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "3a554ced-b6a5-4a0f-b573-c1e3c6cf8382" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.591s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:46:21 compute-0 neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d[214258]: [NOTICE]   (214262) : New worker (214264) forked
Nov 28 17:46:21 compute-0 neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d[214258]: [NOTICE]   (214262) : Loading success.
Nov 28 17:46:22 compute-0 nova_compute[187223]: 2025-11-28 17:46:22.358 187227 DEBUG nova.network.neutron [req-f39c66df-5730-4984-8eec-0991599faf2f req-ec34f486-39b4-4d79-8d08-0b9401136dca 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] Updated VIF entry in instance network info cache for port d5f25ad5-b616-4614-9ff6-76e2a057dc48. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 28 17:46:22 compute-0 nova_compute[187223]: 2025-11-28 17:46:22.360 187227 DEBUG nova.network.neutron [req-f39c66df-5730-4984-8eec-0991599faf2f req-ec34f486-39b4-4d79-8d08-0b9401136dca 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] Updating instance_info_cache with network_info: [{"id": "d5f25ad5-b616-4614-9ff6-76e2a057dc48", "address": "fa:16:3e:11:ea:77", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd5f25ad5-b6", "ovs_interfaceid": "d5f25ad5-b616-4614-9ff6-76e2a057dc48", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:46:22 compute-0 nova_compute[187223]: 2025-11-28 17:46:22.387 187227 DEBUG oslo_concurrency.lockutils [req-f39c66df-5730-4984-8eec-0991599faf2f req-ec34f486-39b4-4d79-8d08-0b9401136dca 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Releasing lock "refresh_cache-3a554ced-b6a5-4a0f-b573-c1e3c6cf8382" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 17:46:23 compute-0 podman[214273]: 2025-11-28 17:46:23.229389703 +0000 UTC m=+0.075711616 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vcs-type=git, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, 
version=9.6, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41)
Nov 28 17:46:23 compute-0 nova_compute[187223]: 2025-11-28 17:46:23.787 187227 DEBUG nova.compute.manager [req-cd534c9b-c460-499f-af88-92d928dc729c req-52a2b79a-cf40-4e74-9d8f-3e04870ac1a5 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] Received event network-vif-plugged-d5f25ad5-b616-4614-9ff6-76e2a057dc48 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:46:23 compute-0 nova_compute[187223]: 2025-11-28 17:46:23.788 187227 DEBUG oslo_concurrency.lockutils [req-cd534c9b-c460-499f-af88-92d928dc729c req-52a2b79a-cf40-4e74-9d8f-3e04870ac1a5 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "3a554ced-b6a5-4a0f-b573-c1e3c6cf8382-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:46:23 compute-0 nova_compute[187223]: 2025-11-28 17:46:23.788 187227 DEBUG oslo_concurrency.lockutils [req-cd534c9b-c460-499f-af88-92d928dc729c req-52a2b79a-cf40-4e74-9d8f-3e04870ac1a5 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "3a554ced-b6a5-4a0f-b573-c1e3c6cf8382-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:46:23 compute-0 nova_compute[187223]: 2025-11-28 17:46:23.788 187227 DEBUG oslo_concurrency.lockutils [req-cd534c9b-c460-499f-af88-92d928dc729c req-52a2b79a-cf40-4e74-9d8f-3e04870ac1a5 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "3a554ced-b6a5-4a0f-b573-c1e3c6cf8382-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:46:23 compute-0 nova_compute[187223]: 2025-11-28 17:46:23.788 187227 DEBUG nova.compute.manager [req-cd534c9b-c460-499f-af88-92d928dc729c req-52a2b79a-cf40-4e74-9d8f-3e04870ac1a5 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] No waiting events found dispatching network-vif-plugged-d5f25ad5-b616-4614-9ff6-76e2a057dc48 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:46:23 compute-0 nova_compute[187223]: 2025-11-28 17:46:23.789 187227 WARNING nova.compute.manager [req-cd534c9b-c460-499f-af88-92d928dc729c req-52a2b79a-cf40-4e74-9d8f-3e04870ac1a5 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] Received unexpected event network-vif-plugged-d5f25ad5-b616-4614-9ff6-76e2a057dc48 for instance with vm_state active and task_state None.
Nov 28 17:46:23 compute-0 nova_compute[187223]: 2025-11-28 17:46:23.926 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:46:24 compute-0 nova_compute[187223]: 2025-11-28 17:46:24.677 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:46:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:46:27.697 104433 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:46:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:46:27.698 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:46:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:46:27.699 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:46:28 compute-0 nova_compute[187223]: 2025-11-28 17:46:28.929 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:46:29 compute-0 nova_compute[187223]: 2025-11-28 17:46:29.680 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:46:29 compute-0 podman[197556]: time="2025-11-28T17:46:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:46:29 compute-0 podman[197556]: @ - - [28/Nov/2025:17:46:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18323 "" "Go-http-client/1.1"
Nov 28 17:46:29 compute-0 podman[197556]: @ - - [28/Nov/2025:17:46:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3060 "" "Go-http-client/1.1"
Nov 28 17:46:31 compute-0 openstack_network_exporter[199717]: ERROR   17:46:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:46:31 compute-0 openstack_network_exporter[199717]: ERROR   17:46:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:46:31 compute-0 openstack_network_exporter[199717]: ERROR   17:46:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:46:31 compute-0 openstack_network_exporter[199717]: ERROR   17:46:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:46:31 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:46:31 compute-0 openstack_network_exporter[199717]: ERROR   17:46:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:46:31 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:46:33 compute-0 ovn_controller[95574]: 2025-11-28T17:46:33Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:11:ea:77 10.100.0.12
Nov 28 17:46:33 compute-0 ovn_controller[95574]: 2025-11-28T17:46:33Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:11:ea:77 10.100.0.12
Nov 28 17:46:33 compute-0 nova_compute[187223]: 2025-11-28 17:46:33.932 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:46:34 compute-0 nova_compute[187223]: 2025-11-28 17:46:34.683 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:46:36 compute-0 nova_compute[187223]: 2025-11-28 17:46:36.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:46:36 compute-0 nova_compute[187223]: 2025-11-28 17:46:36.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:46:36 compute-0 nova_compute[187223]: 2025-11-28 17:46:36.684 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 17:46:38 compute-0 nova_compute[187223]: 2025-11-28 17:46:38.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:46:38 compute-0 nova_compute[187223]: 2025-11-28 17:46:38.717 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:46:38 compute-0 nova_compute[187223]: 2025-11-28 17:46:38.717 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:46:38 compute-0 nova_compute[187223]: 2025-11-28 17:46:38.717 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:46:38 compute-0 nova_compute[187223]: 2025-11-28 17:46:38.718 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 17:46:38 compute-0 nova_compute[187223]: 2025-11-28 17:46:38.823 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3a554ced-b6a5-4a0f-b573-c1e3c6cf8382/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:46:38 compute-0 podman[214315]: 2025-11-28 17:46:38.84776342 +0000 UTC m=+0.066848969 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 17:46:38 compute-0 nova_compute[187223]: 2025-11-28 17:46:38.900 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3a554ced-b6a5-4a0f-b573-c1e3c6cf8382/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:46:38 compute-0 nova_compute[187223]: 2025-11-28 17:46:38.903 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3a554ced-b6a5-4a0f-b573-c1e3c6cf8382/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:46:38 compute-0 nova_compute[187223]: 2025-11-28 17:46:38.934 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:46:38 compute-0 nova_compute[187223]: 2025-11-28 17:46:38.965 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3a554ced-b6a5-4a0f-b573-c1e3c6cf8382/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:46:39 compute-0 nova_compute[187223]: 2025-11-28 17:46:39.159 187227 WARNING nova.virt.libvirt.driver [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 17:46:39 compute-0 nova_compute[187223]: 2025-11-28 17:46:39.161 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5707MB free_disk=73.31182098388672GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 17:46:39 compute-0 nova_compute[187223]: 2025-11-28 17:46:39.161 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:46:39 compute-0 nova_compute[187223]: 2025-11-28 17:46:39.161 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:46:39 compute-0 nova_compute[187223]: 2025-11-28 17:46:39.233 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Instance 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 17:46:39 compute-0 nova_compute[187223]: 2025-11-28 17:46:39.234 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 17:46:39 compute-0 nova_compute[187223]: 2025-11-28 17:46:39.234 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 17:46:39 compute-0 nova_compute[187223]: 2025-11-28 17:46:39.267 187227 DEBUG nova.scheduler.client.report [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Refreshing inventories for resource provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 28 17:46:39 compute-0 nova_compute[187223]: 2025-11-28 17:46:39.284 187227 DEBUG nova.scheduler.client.report [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Updating ProviderTree inventory for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 28 17:46:39 compute-0 nova_compute[187223]: 2025-11-28 17:46:39.284 187227 DEBUG nova.compute.provider_tree [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Updating inventory in ProviderTree for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 28 17:46:39 compute-0 nova_compute[187223]: 2025-11-28 17:46:39.297 187227 DEBUG nova.scheduler.client.report [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Refreshing aggregate associations for resource provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 28 17:46:39 compute-0 nova_compute[187223]: 2025-11-28 17:46:39.318 187227 DEBUG nova.scheduler.client.report [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Refreshing trait associations for resource provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5, traits: COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_USB,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE41,HW_CPU_X86_SSE42,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_SSE,HW_CPU_X86_SSSE3,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 28 17:46:39 compute-0 nova_compute[187223]: 2025-11-28 17:46:39.360 187227 DEBUG nova.compute.provider_tree [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 17:46:39 compute-0 nova_compute[187223]: 2025-11-28 17:46:39.381 187227 DEBUG nova.scheduler.client.report [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 17:46:39 compute-0 nova_compute[187223]: 2025-11-28 17:46:39.406 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 17:46:39 compute-0 nova_compute[187223]: 2025-11-28 17:46:39.407 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.246s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:46:39 compute-0 nova_compute[187223]: 2025-11-28 17:46:39.685 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:46:40 compute-0 nova_compute[187223]: 2025-11-28 17:46:40.406 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:46:40 compute-0 nova_compute[187223]: 2025-11-28 17:46:40.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:46:41 compute-0 nova_compute[187223]: 2025-11-28 17:46:41.680 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:46:41 compute-0 nova_compute[187223]: 2025-11-28 17:46:41.757 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:46:41 compute-0 nova_compute[187223]: 2025-11-28 17:46:41.758 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 17:46:41 compute-0 nova_compute[187223]: 2025-11-28 17:46:41.758 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 17:46:42 compute-0 nova_compute[187223]: 2025-11-28 17:46:42.048 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "refresh_cache-3a554ced-b6a5-4a0f-b573-c1e3c6cf8382" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 17:46:42 compute-0 nova_compute[187223]: 2025-11-28 17:46:42.050 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquired lock "refresh_cache-3a554ced-b6a5-4a0f-b573-c1e3c6cf8382" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 17:46:42 compute-0 nova_compute[187223]: 2025-11-28 17:46:42.050 187227 DEBUG nova.network.neutron [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 28 17:46:42 compute-0 nova_compute[187223]: 2025-11-28 17:46:42.050 187227 DEBUG nova.objects.instance [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:46:43 compute-0 nova_compute[187223]: 2025-11-28 17:46:43.685 187227 DEBUG nova.network.neutron [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] Updating instance_info_cache with network_info: [{"id": "d5f25ad5-b616-4614-9ff6-76e2a057dc48", "address": "fa:16:3e:11:ea:77", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd5f25ad5-b6", "ovs_interfaceid": "d5f25ad5-b616-4614-9ff6-76e2a057dc48", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:46:43 compute-0 nova_compute[187223]: 2025-11-28 17:46:43.704 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Releasing lock "refresh_cache-3a554ced-b6a5-4a0f-b573-c1e3c6cf8382" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 17:46:43 compute-0 nova_compute[187223]: 2025-11-28 17:46:43.705 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 28 17:46:43 compute-0 nova_compute[187223]: 2025-11-28 17:46:43.705 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:46:43 compute-0 nova_compute[187223]: 2025-11-28 17:46:43.937 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:46:44 compute-0 nova_compute[187223]: 2025-11-28 17:46:44.688 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:46:45 compute-0 podman[214346]: 2025-11-28 17:46:45.231194863 +0000 UTC m=+0.078744874 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible)
Nov 28 17:46:45 compute-0 nova_compute[187223]: 2025-11-28 17:46:45.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:46:45 compute-0 nova_compute[187223]: 2025-11-28 17:46:45.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:46:48 compute-0 nova_compute[187223]: 2025-11-28 17:46:48.987 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:46:49 compute-0 nova_compute[187223]: 2025-11-28 17:46:49.691 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:46:51 compute-0 podman[214365]: 2025-11-28 17:46:51.24183173 +0000 UTC m=+0.097341173 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 28 17:46:51 compute-0 podman[214366]: 2025-11-28 17:46:51.31151917 +0000 UTC m=+0.160163994 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 28 17:46:53 compute-0 nova_compute[187223]: 2025-11-28 17:46:53.990 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:46:54 compute-0 podman[214411]: 2025-11-28 17:46:54.198157968 +0000 UTC m=+0.064183732 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, io.openshift.expose-services=, name=ubi9-minimal, vcs-type=git, version=9.6, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 28 17:46:54 compute-0 nova_compute[187223]: 2025-11-28 17:46:54.695 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:46:58 compute-0 nova_compute[187223]: 2025-11-28 17:46:58.993 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:46:59 compute-0 nova_compute[187223]: 2025-11-28 17:46:59.699 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:46:59 compute-0 podman[197556]: time="2025-11-28T17:46:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:46:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:46:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18323 "" "Go-http-client/1.1"
Nov 28 17:46:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:46:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3064 "" "Go-http-client/1.1"
Nov 28 17:47:00 compute-0 ovn_controller[95574]: 2025-11-28T17:47:00Z|00128|memory_trim|INFO|Detected inactivity (last active 30011 ms ago): trimming memory
Nov 28 17:47:01 compute-0 anacron[31145]: Job `cron.weekly' started
Nov 28 17:47:01 compute-0 anacron[31145]: Job `cron.weekly' terminated
Nov 28 17:47:01 compute-0 openstack_network_exporter[199717]: ERROR   17:47:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:47:01 compute-0 openstack_network_exporter[199717]: ERROR   17:47:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:47:01 compute-0 openstack_network_exporter[199717]: ERROR   17:47:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:47:01 compute-0 openstack_network_exporter[199717]: ERROR   17:47:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:47:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:47:01 compute-0 openstack_network_exporter[199717]: ERROR   17:47:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:47:01 compute-0 openstack_network_exporter[199717]: 
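The appctl errors above are the exporter failing to locate daemon control sockets before issuing commands. A minimal sketch of the lookup such a client typically performs — the run-directory layout and function name are assumptions based on the usual OVS/OVN defaults, not this host's config:

```python
# Hedged sketch: how an appctl-style client finds a daemon's control socket
# (daemons create <rundir>/<daemon>.<pid>.ctl on startup).
import glob
import os

def find_ctl_socket(daemon, rundir):
    """Return the first <rundir>/<daemon>.<pid>.ctl match, or None."""
    matches = sorted(glob.glob(os.path.join(rundir, daemon + ".*.ctl")))
    return matches[0] if matches else None
```

On a compute node ovn-northd does not run at all (it lives on the control plane), so a lookup like this finds nothing and the exporter logs "no control socket files found for ovn-northd" — noisy but expected here.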
Nov 28 17:47:03 compute-0 sshd-session[214434]: Invalid user node from 193.32.162.146 port 52384
Nov 28 17:47:03 compute-0 sshd-session[214434]: Connection closed by invalid user node 193.32.162.146 port 52384 [preauth]
Nov 28 17:47:03 compute-0 nova_compute[187223]: 2025-11-28 17:47:03.996 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:47:04 compute-0 nova_compute[187223]: 2025-11-28 17:47:04.702 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:47:08 compute-0 nova_compute[187223]: 2025-11-28 17:47:08.999 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:47:09 compute-0 podman[214436]: 2025-11-28 17:47:09.238512078 +0000 UTC m=+0.095249432 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 17:47:09 compute-0 nova_compute[187223]: 2025-11-28 17:47:09.706 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:47:14 compute-0 nova_compute[187223]: 2025-11-28 17:47:14.001 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:47:14 compute-0 nova_compute[187223]: 2025-11-28 17:47:14.708 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:47:16 compute-0 podman[214462]: 2025-11-28 17:47:16.21202732 +0000 UTC m=+0.072047739 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 17:47:19 compute-0 nova_compute[187223]: 2025-11-28 17:47:19.004 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:47:19 compute-0 nova_compute[187223]: 2025-11-28 17:47:19.710 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:47:22 compute-0 podman[214496]: 2025-11-28 17:47:22.210058482 +0000 UTC m=+0.071828654 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 28 17:47:22 compute-0 nova_compute[187223]: 2025-11-28 17:47:22.263 187227 DEBUG nova.virt.libvirt.driver [None req-c48f86bc-e939-444c-bddb-37ff2aa10883 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 9141f01f-f382-4cda-95cf-2445111a5096] Creating tmpfile /var/lib/nova/instances/tmpcl6r84lk to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Nov 28 17:47:22 compute-0 podman[214497]: 2025-11-28 17:47:22.278897557 +0000 UTC m=+0.135753096 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 28 17:47:22 compute-0 nova_compute[187223]: 2025-11-28 17:47:22.617 187227 DEBUG nova.compute.manager [None req-c48f86bc-e939-444c-bddb-37ff2aa10883 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpcl6r84lk',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
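The tmpfile created at 17:47:22 is Nova's shared-storage probe: the destination drops a uniquely named marker file under the instances path, and the source host then checks whether that same file is visible on its side, which proves (or disproves) that both hosts mount the same storage. A simplified standalone sketch of that handshake — function names are illustrative, not Nova's exact internals:

```python
import os
import tempfile

def create_shared_storage_test_file(instances_path):
    """Destination side: drop a marker file and return its name."""
    fd, path = tempfile.mkstemp(dir=instances_path, prefix="tmp")
    os.close(fd)
    # the bare name (e.g. 'tmpcl6r84lk') travels to the source in migrate_data
    return os.path.basename(path)

def check_shared_storage_test_file(instances_path, filename):
    """Source side: if the marker is visible here, the storage is shared."""
    return os.path.exists(os.path.join(instances_path, filename))
```

In this trace the probe comes back negative (is_shared_instance_path=False in the pre_live_migration data at 17:47:24), so Nova proceeds with block migration and recreates the instance directory and disks locally.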
Nov 28 17:47:24 compute-0 nova_compute[187223]: 2025-11-28 17:47:24.006 187227 DEBUG nova.compute.manager [None req-c48f86bc-e939-444c-bddb-37ff2aa10883 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpcl6r84lk',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='9141f01f-f382-4cda-95cf-2445111a5096',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Nov 28 17:47:24 compute-0 nova_compute[187223]: 2025-11-28 17:47:24.008 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:47:24 compute-0 nova_compute[187223]: 2025-11-28 17:47:24.044 187227 DEBUG oslo_concurrency.lockutils [None req-c48f86bc-e939-444c-bddb-37ff2aa10883 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "refresh_cache-9141f01f-f382-4cda-95cf-2445111a5096" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 17:47:24 compute-0 nova_compute[187223]: 2025-11-28 17:47:24.045 187227 DEBUG oslo_concurrency.lockutils [None req-c48f86bc-e939-444c-bddb-37ff2aa10883 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquired lock "refresh_cache-9141f01f-f382-4cda-95cf-2445111a5096" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 17:47:24 compute-0 nova_compute[187223]: 2025-11-28 17:47:24.045 187227 DEBUG nova.network.neutron [None req-c48f86bc-e939-444c-bddb-37ff2aa10883 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 9141f01f-f382-4cda-95cf-2445111a5096] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 28 17:47:24 compute-0 nova_compute[187223]: 2025-11-28 17:47:24.711 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:47:25 compute-0 podman[214542]: 2025-11-28 17:47:25.248670465 +0000 UTC m=+0.094557133 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, release=1755695350, architecture=x86_64, config_id=edpm, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, distribution-scope=public, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible)
Nov 28 17:47:26 compute-0 nova_compute[187223]: 2025-11-28 17:47:26.313 187227 DEBUG nova.network.neutron [None req-c48f86bc-e939-444c-bddb-37ff2aa10883 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 9141f01f-f382-4cda-95cf-2445111a5096] Updating instance_info_cache with network_info: [{"id": "0156a04b-7f6a-479f-a897-2eec2dd5b73a", "address": "fa:16:3e:46:5a:ff", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0156a04b-7f", "ovs_interfaceid": "0156a04b-7f6a-479f-a897-2eec2dd5b73a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:47:26 compute-0 nova_compute[187223]: 2025-11-28 17:47:26.332 187227 DEBUG oslo_concurrency.lockutils [None req-c48f86bc-e939-444c-bddb-37ff2aa10883 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Releasing lock "refresh_cache-9141f01f-f382-4cda-95cf-2445111a5096" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 17:47:26 compute-0 nova_compute[187223]: 2025-11-28 17:47:26.335 187227 DEBUG nova.virt.libvirt.driver [None req-c48f86bc-e939-444c-bddb-37ff2aa10883 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 9141f01f-f382-4cda-95cf-2445111a5096] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpcl6r84lk',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='9141f01f-f382-4cda-95cf-2445111a5096',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Nov 28 17:47:26 compute-0 nova_compute[187223]: 2025-11-28 17:47:26.335 187227 DEBUG nova.virt.libvirt.driver [None req-c48f86bc-e939-444c-bddb-37ff2aa10883 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 9141f01f-f382-4cda-95cf-2445111a5096] Creating instance directory: /var/lib/nova/instances/9141f01f-f382-4cda-95cf-2445111a5096 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Nov 28 17:47:26 compute-0 nova_compute[187223]: 2025-11-28 17:47:26.336 187227 DEBUG nova.virt.libvirt.driver [None req-c48f86bc-e939-444c-bddb-37ff2aa10883 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 9141f01f-f382-4cda-95cf-2445111a5096] Creating disk.info with the contents: {'/var/lib/nova/instances/9141f01f-f382-4cda-95cf-2445111a5096/disk': 'qcow2', '/var/lib/nova/instances/9141f01f-f382-4cda-95cf-2445111a5096/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Nov 28 17:47:26 compute-0 nova_compute[187223]: 2025-11-28 17:47:26.336 187227 DEBUG nova.virt.libvirt.driver [None req-c48f86bc-e939-444c-bddb-37ff2aa10883 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 9141f01f-f382-4cda-95cf-2445111a5096] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Nov 28 17:47:26 compute-0 nova_compute[187223]: 2025-11-28 17:47:26.337 187227 DEBUG nova.objects.instance [None req-c48f86bc-e939-444c-bddb-37ff2aa10883 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 9141f01f-f382-4cda-95cf-2445111a5096 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:47:26 compute-0 nova_compute[187223]: 2025-11-28 17:47:26.363 187227 DEBUG oslo_concurrency.processutils [None req-c48f86bc-e939-444c-bddb-37ff2aa10883 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:47:26 compute-0 nova_compute[187223]: 2025-11-28 17:47:26.421 187227 DEBUG oslo_concurrency.processutils [None req-c48f86bc-e939-444c-bddb-37ff2aa10883 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:47:26 compute-0 nova_compute[187223]: 2025-11-28 17:47:26.422 187227 DEBUG oslo_concurrency.lockutils [None req-c48f86bc-e939-444c-bddb-37ff2aa10883 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "bd6565e0cc32420aac21dd3de8842bc53bb13eba" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:47:26 compute-0 nova_compute[187223]: 2025-11-28 17:47:26.423 187227 DEBUG oslo_concurrency.lockutils [None req-c48f86bc-e939-444c-bddb-37ff2aa10883 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "bd6565e0cc32420aac21dd3de8842bc53bb13eba" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:47:26 compute-0 nova_compute[187223]: 2025-11-28 17:47:26.434 187227 DEBUG oslo_concurrency.processutils [None req-c48f86bc-e939-444c-bddb-37ff2aa10883 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:47:26 compute-0 nova_compute[187223]: 2025-11-28 17:47:26.489 187227 DEBUG oslo_concurrency.processutils [None req-c48f86bc-e939-444c-bddb-37ff2aa10883 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:47:26 compute-0 nova_compute[187223]: 2025-11-28 17:47:26.491 187227 DEBUG oslo_concurrency.processutils [None req-c48f86bc-e939-444c-bddb-37ff2aa10883 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba,backing_fmt=raw /var/lib/nova/instances/9141f01f-f382-4cda-95cf-2445111a5096/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:47:26 compute-0 nova_compute[187223]: 2025-11-28 17:47:26.528 187227 DEBUG oslo_concurrency.processutils [None req-c48f86bc-e939-444c-bddb-37ff2aa10883 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba,backing_fmt=raw /var/lib/nova/instances/9141f01f-f382-4cda-95cf-2445111a5096/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:47:26 compute-0 nova_compute[187223]: 2025-11-28 17:47:26.531 187227 DEBUG oslo_concurrency.lockutils [None req-c48f86bc-e939-444c-bddb-37ff2aa10883 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "bd6565e0cc32420aac21dd3de8842bc53bb13eba" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:47:26 compute-0 nova_compute[187223]: 2025-11-28 17:47:26.532 187227 DEBUG oslo_concurrency.processutils [None req-c48f86bc-e939-444c-bddb-37ff2aa10883 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:47:26 compute-0 nova_compute[187223]: 2025-11-28 17:47:26.590 187227 DEBUG oslo_concurrency.processutils [None req-c48f86bc-e939-444c-bddb-37ff2aa10883 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:47:26 compute-0 nova_compute[187223]: 2025-11-28 17:47:26.591 187227 DEBUG nova.virt.disk.api [None req-c48f86bc-e939-444c-bddb-37ff2aa10883 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Checking if we can resize image /var/lib/nova/instances/9141f01f-f382-4cda-95cf-2445111a5096/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 28 17:47:26 compute-0 nova_compute[187223]: 2025-11-28 17:47:26.592 187227 DEBUG oslo_concurrency.processutils [None req-c48f86bc-e939-444c-bddb-37ff2aa10883 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9141f01f-f382-4cda-95cf-2445111a5096/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:47:26 compute-0 nova_compute[187223]: 2025-11-28 17:47:26.650 187227 DEBUG oslo_concurrency.processutils [None req-c48f86bc-e939-444c-bddb-37ff2aa10883 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9141f01f-f382-4cda-95cf-2445111a5096/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:47:26 compute-0 nova_compute[187223]: 2025-11-28 17:47:26.652 187227 DEBUG nova.virt.disk.api [None req-c48f86bc-e939-444c-bddb-37ff2aa10883 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Cannot resize image /var/lib/nova/instances/9141f01f-f382-4cda-95cf-2445111a5096/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 28 17:47:26 compute-0 nova_compute[187223]: 2025-11-28 17:47:26.652 187227 DEBUG nova.objects.instance [None req-c48f86bc-e939-444c-bddb-37ff2aa10883 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lazy-loading 'migration_context' on Instance uuid 9141f01f-f382-4cda-95cf-2445111a5096 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:47:26 compute-0 nova_compute[187223]: 2025-11-28 17:47:26.665 187227 DEBUG oslo_concurrency.processutils [None req-c48f86bc-e939-444c-bddb-37ff2aa10883 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/9141f01f-f382-4cda-95cf-2445111a5096/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:47:26 compute-0 nova_compute[187223]: 2025-11-28 17:47:26.688 187227 DEBUG oslo_concurrency.processutils [None req-c48f86bc-e939-444c-bddb-37ff2aa10883 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/9141f01f-f382-4cda-95cf-2445111a5096/disk.config 485376" returned: 0 in 0.024s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:47:26 compute-0 nova_compute[187223]: 2025-11-28 17:47:26.690 187227 DEBUG nova.virt.libvirt.volume.remotefs [None req-c48f86bc-e939-444c-bddb-37ff2aa10883 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Copying file compute-1.ctlplane.example.com:/var/lib/nova/instances/9141f01f-f382-4cda-95cf-2445111a5096/disk.config to /var/lib/nova/instances/9141f01f-f382-4cda-95cf-2445111a5096 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Nov 28 17:47:26 compute-0 nova_compute[187223]: 2025-11-28 17:47:26.690 187227 DEBUG oslo_concurrency.processutils [None req-c48f86bc-e939-444c-bddb-37ff2aa10883 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Running cmd (subprocess): scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/9141f01f-f382-4cda-95cf-2445111a5096/disk.config /var/lib/nova/instances/9141f01f-f382-4cda-95cf-2445111a5096 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:47:27 compute-0 nova_compute[187223]: 2025-11-28 17:47:27.200 187227 DEBUG oslo_concurrency.processutils [None req-c48f86bc-e939-444c-bddb-37ff2aa10883 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] CMD "scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/9141f01f-f382-4cda-95cf-2445111a5096/disk.config /var/lib/nova/instances/9141f01f-f382-4cda-95cf-2445111a5096" returned: 0 in 0.509s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:47:27 compute-0 nova_compute[187223]: 2025-11-28 17:47:27.201 187227 DEBUG nova.virt.libvirt.driver [None req-c48f86bc-e939-444c-bddb-37ff2aa10883 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 9141f01f-f382-4cda-95cf-2445111a5096] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Nov 28 17:47:27 compute-0 nova_compute[187223]: 2025-11-28 17:47:27.202 187227 DEBUG nova.virt.libvirt.vif [None req-c48f86bc-e939-444c-bddb-37ff2aa10883 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T17:46:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-217199433',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-217199433',id=16,image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-28T17:46:34Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='f987f40adf1f46018ab0ca81b8d954f6',ramdisk_id='',reservation_id='r-0pbro1un',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',ow
ner_project_name='tempest-TestExecuteStrategies-384316604',owner_user_name='tempest-TestExecuteStrategies-384316604-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T17:46:34Z,user_data=None,user_id='40bca16232f3471c8094a414f8874e9a',uuid=9141f01f-f382-4cda-95cf-2445111a5096,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0156a04b-7f6a-479f-a897-2eec2dd5b73a", "address": "fa:16:3e:46:5a:ff", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap0156a04b-7f", "ovs_interfaceid": "0156a04b-7f6a-479f-a897-2eec2dd5b73a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 28 17:47:27 compute-0 nova_compute[187223]: 2025-11-28 17:47:27.202 187227 DEBUG nova.network.os_vif_util [None req-c48f86bc-e939-444c-bddb-37ff2aa10883 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Converting VIF {"id": "0156a04b-7f6a-479f-a897-2eec2dd5b73a", "address": "fa:16:3e:46:5a:ff", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap0156a04b-7f", "ovs_interfaceid": "0156a04b-7f6a-479f-a897-2eec2dd5b73a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 17:47:27 compute-0 nova_compute[187223]: 2025-11-28 17:47:27.203 187227 DEBUG nova.network.os_vif_util [None req-c48f86bc-e939-444c-bddb-37ff2aa10883 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:46:5a:ff,bridge_name='br-int',has_traffic_filtering=True,id=0156a04b-7f6a-479f-a897-2eec2dd5b73a,network=Network(7710a7d0-31b3-4473-89c4-40533fdd6e7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0156a04b-7f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 17:47:27 compute-0 nova_compute[187223]: 2025-11-28 17:47:27.204 187227 DEBUG os_vif [None req-c48f86bc-e939-444c-bddb-37ff2aa10883 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:46:5a:ff,bridge_name='br-int',has_traffic_filtering=True,id=0156a04b-7f6a-479f-a897-2eec2dd5b73a,network=Network(7710a7d0-31b3-4473-89c4-40533fdd6e7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0156a04b-7f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 28 17:47:27 compute-0 nova_compute[187223]: 2025-11-28 17:47:27.204 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:47:27 compute-0 nova_compute[187223]: 2025-11-28 17:47:27.205 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:47:27 compute-0 nova_compute[187223]: 2025-11-28 17:47:27.205 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 17:47:27 compute-0 nova_compute[187223]: 2025-11-28 17:47:27.210 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:47:27 compute-0 nova_compute[187223]: 2025-11-28 17:47:27.210 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0156a04b-7f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:47:27 compute-0 nova_compute[187223]: 2025-11-28 17:47:27.211 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0156a04b-7f, col_values=(('external_ids', {'iface-id': '0156a04b-7f6a-479f-a897-2eec2dd5b73a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:46:5a:ff', 'vm-uuid': '9141f01f-f382-4cda-95cf-2445111a5096'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:47:27 compute-0 nova_compute[187223]: 2025-11-28 17:47:27.213 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:47:27 compute-0 NetworkManager[55763]: <info>  [1764352047.2150] manager: (tap0156a04b-7f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/56)
Nov 28 17:47:27 compute-0 nova_compute[187223]: 2025-11-28 17:47:27.215 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 17:47:27 compute-0 nova_compute[187223]: 2025-11-28 17:47:27.221 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:47:27 compute-0 nova_compute[187223]: 2025-11-28 17:47:27.222 187227 INFO os_vif [None req-c48f86bc-e939-444c-bddb-37ff2aa10883 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:46:5a:ff,bridge_name='br-int',has_traffic_filtering=True,id=0156a04b-7f6a-479f-a897-2eec2dd5b73a,network=Network(7710a7d0-31b3-4473-89c4-40533fdd6e7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0156a04b-7f')
Nov 28 17:47:27 compute-0 nova_compute[187223]: 2025-11-28 17:47:27.223 187227 DEBUG nova.virt.libvirt.driver [None req-c48f86bc-e939-444c-bddb-37ff2aa10883 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Nov 28 17:47:27 compute-0 nova_compute[187223]: 2025-11-28 17:47:27.223 187227 DEBUG nova.compute.manager [None req-c48f86bc-e939-444c-bddb-37ff2aa10883 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpcl6r84lk',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='9141f01f-f382-4cda-95cf-2445111a5096',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Nov 28 17:47:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:47:27.699 104433 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:47:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:47:27.701 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:47:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:47:27.702 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:47:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:47:27.843 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3e:3c:af', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '0e:e0:60:f9:5f:25'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 17:47:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:47:27.844 104433 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 17:47:27 compute-0 nova_compute[187223]: 2025-11-28 17:47:27.891 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:47:28 compute-0 nova_compute[187223]: 2025-11-28 17:47:28.534 187227 DEBUG nova.network.neutron [None req-c48f86bc-e939-444c-bddb-37ff2aa10883 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 9141f01f-f382-4cda-95cf-2445111a5096] Port 0156a04b-7f6a-479f-a897-2eec2dd5b73a updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Nov 28 17:47:28 compute-0 nova_compute[187223]: 2025-11-28 17:47:28.536 187227 DEBUG nova.compute.manager [None req-c48f86bc-e939-444c-bddb-37ff2aa10883 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpcl6r84lk',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='9141f01f-f382-4cda-95cf-2445111a5096',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Nov 28 17:47:28 compute-0 systemd[1]: Starting libvirt proxy daemon...
Nov 28 17:47:28 compute-0 systemd[1]: Started libvirt proxy daemon.
Nov 28 17:47:28 compute-0 kernel: tap0156a04b-7f: entered promiscuous mode
Nov 28 17:47:28 compute-0 NetworkManager[55763]: <info>  [1764352048.9752] manager: (tap0156a04b-7f): new Tun device (/org/freedesktop/NetworkManager/Devices/57)
Nov 28 17:47:28 compute-0 ovn_controller[95574]: 2025-11-28T17:47:28Z|00129|binding|INFO|Claiming lport 0156a04b-7f6a-479f-a897-2eec2dd5b73a for this additional chassis.
Nov 28 17:47:28 compute-0 ovn_controller[95574]: 2025-11-28T17:47:28Z|00130|binding|INFO|0156a04b-7f6a-479f-a897-2eec2dd5b73a: Claiming fa:16:3e:46:5a:ff 10.100.0.10
Nov 28 17:47:28 compute-0 nova_compute[187223]: 2025-11-28 17:47:28.975 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:47:28 compute-0 ovn_controller[95574]: 2025-11-28T17:47:28Z|00131|binding|INFO|Setting lport 0156a04b-7f6a-479f-a897-2eec2dd5b73a ovn-installed in OVS
Nov 28 17:47:28 compute-0 nova_compute[187223]: 2025-11-28 17:47:28.992 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:47:28 compute-0 nova_compute[187223]: 2025-11-28 17:47:28.994 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:47:29 compute-0 nova_compute[187223]: 2025-11-28 17:47:29.007 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:47:29 compute-0 systemd-machined[153517]: New machine qemu-12-instance-00000010.
Nov 28 17:47:29 compute-0 systemd-udevd[214619]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 17:47:29 compute-0 systemd[1]: Started Virtual Machine qemu-12-instance-00000010.
Nov 28 17:47:29 compute-0 NetworkManager[55763]: <info>  [1764352049.0564] device (tap0156a04b-7f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 28 17:47:29 compute-0 NetworkManager[55763]: <info>  [1764352049.0576] device (tap0156a04b-7f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 28 17:47:29 compute-0 nova_compute[187223]: 2025-11-28 17:47:29.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:47:29 compute-0 nova_compute[187223]: 2025-11-28 17:47:29.686 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 28 17:47:29 compute-0 nova_compute[187223]: 2025-11-28 17:47:29.714 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 28 17:47:29 compute-0 podman[197556]: time="2025-11-28T17:47:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:47:29 compute-0 podman[197556]: @ - - [28/Nov/2025:17:47:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18323 "" "Go-http-client/1.1"
Nov 28 17:47:29 compute-0 podman[197556]: @ - - [28/Nov/2025:17:47:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3063 "" "Go-http-client/1.1"
Nov 28 17:47:30 compute-0 nova_compute[187223]: 2025-11-28 17:47:30.060 187227 DEBUG nova.virt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Emitting event <LifecycleEvent: 1764352050.0596702, 9141f01f-f382-4cda-95cf-2445111a5096 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:47:30 compute-0 nova_compute[187223]: 2025-11-28 17:47:30.061 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 9141f01f-f382-4cda-95cf-2445111a5096] VM Started (Lifecycle Event)
Nov 28 17:47:30 compute-0 nova_compute[187223]: 2025-11-28 17:47:30.092 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 9141f01f-f382-4cda-95cf-2445111a5096] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:47:30 compute-0 nova_compute[187223]: 2025-11-28 17:47:30.902 187227 DEBUG nova.virt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Emitting event <LifecycleEvent: 1764352050.90217, 9141f01f-f382-4cda-95cf-2445111a5096 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:47:30 compute-0 nova_compute[187223]: 2025-11-28 17:47:30.904 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 9141f01f-f382-4cda-95cf-2445111a5096] VM Resumed (Lifecycle Event)
Nov 28 17:47:30 compute-0 nova_compute[187223]: 2025-11-28 17:47:30.923 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 9141f01f-f382-4cda-95cf-2445111a5096] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:47:30 compute-0 nova_compute[187223]: 2025-11-28 17:47:30.929 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 9141f01f-f382-4cda-95cf-2445111a5096] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 28 17:47:30 compute-0 nova_compute[187223]: 2025-11-28 17:47:30.947 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 9141f01f-f382-4cda-95cf-2445111a5096] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-0.ctlplane.example.com
Nov 28 17:47:31 compute-0 openstack_network_exporter[199717]: ERROR   17:47:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:47:31 compute-0 openstack_network_exporter[199717]: ERROR   17:47:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:47:31 compute-0 openstack_network_exporter[199717]: ERROR   17:47:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:47:31 compute-0 openstack_network_exporter[199717]: ERROR   17:47:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:47:31 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:47:31 compute-0 openstack_network_exporter[199717]: ERROR   17:47:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:47:31 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:47:31 compute-0 ovn_controller[95574]: 2025-11-28T17:47:31Z|00132|binding|INFO|Claiming lport 0156a04b-7f6a-479f-a897-2eec2dd5b73a for this chassis.
Nov 28 17:47:31 compute-0 ovn_controller[95574]: 2025-11-28T17:47:31Z|00133|binding|INFO|0156a04b-7f6a-479f-a897-2eec2dd5b73a: Claiming fa:16:3e:46:5a:ff 10.100.0.10
Nov 28 17:47:31 compute-0 ovn_controller[95574]: 2025-11-28T17:47:31Z|00134|binding|INFO|Setting lport 0156a04b-7f6a-479f-a897-2eec2dd5b73a up in Southbound
Nov 28 17:47:31 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:47:31.865 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:46:5a:ff 10.100.0.10'], port_security=['fa:16:3e:46:5a:ff 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '9141f01f-f382-4cda-95cf-2445111a5096', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f987f40adf1f46018ab0ca81b8d954f6', 'neutron:revision_number': '11', 'neutron:security_group_ids': '2454196c-3f33-40c4-b65e-ff5ce51cee25', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8b91a17c-2725-4cda-baa8-a6987e8e4bba, chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], logical_port=0156a04b-7f6a-479f-a897-2eec2dd5b73a) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 17:47:31 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:47:31.867 104433 INFO neutron.agent.ovn.metadata.agent [-] Port 0156a04b-7f6a-479f-a897-2eec2dd5b73a in datapath 7710a7d0-31b3-4473-89c4-40533fdd6e7d bound to our chassis
Nov 28 17:47:31 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:47:31.871 104433 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7710a7d0-31b3-4473-89c4-40533fdd6e7d
Nov 28 17:47:31 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:47:31.902 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[4201547a-6d1c-493f-9619-32368c861b17]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:47:31 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:47:31.944 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[045d936c-4545-4659-a1cc-e3e0111667e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:47:31 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:47:31.947 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[812b5b10-7f9e-48e7-9ee7-b0997ef50760]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:47:31 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:47:31.978 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[5e3abdfa-f8ed-41a4-a9e7-9dedcc87501e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:47:32 compute-0 nova_compute[187223]: 2025-11-28 17:47:32.005 187227 INFO nova.compute.manager [None req-c48f86bc-e939-444c-bddb-37ff2aa10883 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 9141f01f-f382-4cda-95cf-2445111a5096] Post operation of migration started
Nov 28 17:47:32 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:47:32.006 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[6b9b02e3-9bc8-49b5-a739-08e1fa673238]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7710a7d0-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7e:b9:9f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 37], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 515572, 'reachable_time': 22526, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214646, 'error': None, 'target': 'ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:47:32 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:47:32.028 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[28e29d35-22ba-46d6-bf38-3255e1992b27]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap7710a7d0-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 515585, 'tstamp': 515585}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214647, 'error': None, 'target': 'ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap7710a7d0-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 515589, 'tstamp': 515589}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214647, 'error': None, 'target': 'ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:47:32 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:47:32.031 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7710a7d0-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:47:32 compute-0 nova_compute[187223]: 2025-11-28 17:47:32.033 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:47:32 compute-0 nova_compute[187223]: 2025-11-28 17:47:32.034 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:47:32 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:47:32.035 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7710a7d0-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:47:32 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:47:32.035 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 17:47:32 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:47:32.035 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7710a7d0-30, col_values=(('external_ids', {'iface-id': 'bc789832-2d4b-4b14-95c2-e30a740a3a6b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:47:32 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:47:32.036 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 17:47:32 compute-0 nova_compute[187223]: 2025-11-28 17:47:32.213 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:47:32 compute-0 nova_compute[187223]: 2025-11-28 17:47:32.461 187227 DEBUG oslo_concurrency.lockutils [None req-c48f86bc-e939-444c-bddb-37ff2aa10883 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "refresh_cache-9141f01f-f382-4cda-95cf-2445111a5096" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 17:47:32 compute-0 nova_compute[187223]: 2025-11-28 17:47:32.462 187227 DEBUG oslo_concurrency.lockutils [None req-c48f86bc-e939-444c-bddb-37ff2aa10883 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquired lock "refresh_cache-9141f01f-f382-4cda-95cf-2445111a5096" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 17:47:32 compute-0 nova_compute[187223]: 2025-11-28 17:47:32.462 187227 DEBUG nova.network.neutron [None req-c48f86bc-e939-444c-bddb-37ff2aa10883 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 9141f01f-f382-4cda-95cf-2445111a5096] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 28 17:47:34 compute-0 nova_compute[187223]: 2025-11-28 17:47:34.011 187227 DEBUG nova.network.neutron [None req-c48f86bc-e939-444c-bddb-37ff2aa10883 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 9141f01f-f382-4cda-95cf-2445111a5096] Updating instance_info_cache with network_info: [{"id": "0156a04b-7f6a-479f-a897-2eec2dd5b73a", "address": "fa:16:3e:46:5a:ff", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0156a04b-7f", "ovs_interfaceid": "0156a04b-7f6a-479f-a897-2eec2dd5b73a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:47:34 compute-0 nova_compute[187223]: 2025-11-28 17:47:34.014 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:47:34 compute-0 nova_compute[187223]: 2025-11-28 17:47:34.034 187227 DEBUG oslo_concurrency.lockutils [None req-c48f86bc-e939-444c-bddb-37ff2aa10883 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Releasing lock "refresh_cache-9141f01f-f382-4cda-95cf-2445111a5096" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 17:47:34 compute-0 nova_compute[187223]: 2025-11-28 17:47:34.075 187227 DEBUG oslo_concurrency.lockutils [None req-c48f86bc-e939-444c-bddb-37ff2aa10883 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:47:34 compute-0 nova_compute[187223]: 2025-11-28 17:47:34.075 187227 DEBUG oslo_concurrency.lockutils [None req-c48f86bc-e939-444c-bddb-37ff2aa10883 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:47:34 compute-0 nova_compute[187223]: 2025-11-28 17:47:34.076 187227 DEBUG oslo_concurrency.lockutils [None req-c48f86bc-e939-444c-bddb-37ff2aa10883 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:47:34 compute-0 nova_compute[187223]: 2025-11-28 17:47:34.082 187227 INFO nova.virt.libvirt.driver [None req-c48f86bc-e939-444c-bddb-37ff2aa10883 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 9141f01f-f382-4cda-95cf-2445111a5096] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Nov 28 17:47:34 compute-0 virtqemud[186845]: Domain id=12 name='instance-00000010' uuid=9141f01f-f382-4cda-95cf-2445111a5096 is tainted: custom-monitor
Nov 28 17:47:34 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:47:34.847 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2ad2dbac-a967-40fb-b69b-7c374c5f8e9d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:47:35 compute-0 nova_compute[187223]: 2025-11-28 17:47:35.090 187227 INFO nova.virt.libvirt.driver [None req-c48f86bc-e939-444c-bddb-37ff2aa10883 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 9141f01f-f382-4cda-95cf-2445111a5096] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Nov 28 17:47:36 compute-0 nova_compute[187223]: 2025-11-28 17:47:36.100 187227 INFO nova.virt.libvirt.driver [None req-c48f86bc-e939-444c-bddb-37ff2aa10883 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 9141f01f-f382-4cda-95cf-2445111a5096] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Nov 28 17:47:36 compute-0 nova_compute[187223]: 2025-11-28 17:47:36.108 187227 DEBUG nova.compute.manager [None req-c48f86bc-e939-444c-bddb-37ff2aa10883 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 9141f01f-f382-4cda-95cf-2445111a5096] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:47:36 compute-0 nova_compute[187223]: 2025-11-28 17:47:36.144 187227 DEBUG nova.objects.instance [None req-c48f86bc-e939-444c-bddb-37ff2aa10883 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 9141f01f-f382-4cda-95cf-2445111a5096] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Nov 28 17:47:36 compute-0 nova_compute[187223]: 2025-11-28 17:47:36.714 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:47:37 compute-0 nova_compute[187223]: 2025-11-28 17:47:37.216 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:47:37 compute-0 nova_compute[187223]: 2025-11-28 17:47:37.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:47:37 compute-0 nova_compute[187223]: 2025-11-28 17:47:37.684 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 17:47:39 compute-0 nova_compute[187223]: 2025-11-28 17:47:39.124 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:47:39 compute-0 nova_compute[187223]: 2025-11-28 17:47:39.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:47:39 compute-0 nova_compute[187223]: 2025-11-28 17:47:39.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:47:39 compute-0 nova_compute[187223]: 2025-11-28 17:47:39.728 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:47:39 compute-0 nova_compute[187223]: 2025-11-28 17:47:39.730 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:47:39 compute-0 nova_compute[187223]: 2025-11-28 17:47:39.730 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:47:39 compute-0 nova_compute[187223]: 2025-11-28 17:47:39.731 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 17:47:39 compute-0 nova_compute[187223]: 2025-11-28 17:47:39.851 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3a554ced-b6a5-4a0f-b573-c1e3c6cf8382/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:47:39 compute-0 podman[214649]: 2025-11-28 17:47:39.881135109 +0000 UTC m=+0.081490060 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 17:47:39 compute-0 nova_compute[187223]: 2025-11-28 17:47:39.947 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3a554ced-b6a5-4a0f-b573-c1e3c6cf8382/disk --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:47:39 compute-0 nova_compute[187223]: 2025-11-28 17:47:39.949 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3a554ced-b6a5-4a0f-b573-c1e3c6cf8382/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:47:40 compute-0 nova_compute[187223]: 2025-11-28 17:47:40.031 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3a554ced-b6a5-4a0f-b573-c1e3c6cf8382/disk --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:47:40 compute-0 nova_compute[187223]: 2025-11-28 17:47:40.038 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9141f01f-f382-4cda-95cf-2445111a5096/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:47:40 compute-0 nova_compute[187223]: 2025-11-28 17:47:40.095 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9141f01f-f382-4cda-95cf-2445111a5096/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:47:40 compute-0 nova_compute[187223]: 2025-11-28 17:47:40.096 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9141f01f-f382-4cda-95cf-2445111a5096/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:47:40 compute-0 nova_compute[187223]: 2025-11-28 17:47:40.156 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9141f01f-f382-4cda-95cf-2445111a5096/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:47:40 compute-0 nova_compute[187223]: 2025-11-28 17:47:40.407 187227 WARNING nova.virt.libvirt.driver [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 17:47:40 compute-0 nova_compute[187223]: 2025-11-28 17:47:40.409 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5541MB free_disk=73.28280258178711GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 17:47:40 compute-0 nova_compute[187223]: 2025-11-28 17:47:40.409 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:47:40 compute-0 nova_compute[187223]: 2025-11-28 17:47:40.410 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:47:40 compute-0 nova_compute[187223]: 2025-11-28 17:47:40.579 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Instance 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 17:47:40 compute-0 nova_compute[187223]: 2025-11-28 17:47:40.579 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Instance 9141f01f-f382-4cda-95cf-2445111a5096 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 17:47:40 compute-0 nova_compute[187223]: 2025-11-28 17:47:40.580 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 17:47:40 compute-0 nova_compute[187223]: 2025-11-28 17:47:40.580 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 17:47:40 compute-0 nova_compute[187223]: 2025-11-28 17:47:40.702 187227 DEBUG nova.compute.provider_tree [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 17:47:40 compute-0 nova_compute[187223]: 2025-11-28 17:47:40.717 187227 DEBUG nova.scheduler.client.report [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 17:47:40 compute-0 nova_compute[187223]: 2025-11-28 17:47:40.736 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 17:47:40 compute-0 nova_compute[187223]: 2025-11-28 17:47:40.737 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.327s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:47:41 compute-0 nova_compute[187223]: 2025-11-28 17:47:41.737 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:47:41 compute-0 nova_compute[187223]: 2025-11-28 17:47:41.752 187227 DEBUG oslo_concurrency.lockutils [None req-a805da69-e124-43db-95fd-f5af4a6aa532 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Acquiring lock "9141f01f-f382-4cda-95cf-2445111a5096" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:47:41 compute-0 nova_compute[187223]: 2025-11-28 17:47:41.753 187227 DEBUG oslo_concurrency.lockutils [None req-a805da69-e124-43db-95fd-f5af4a6aa532 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "9141f01f-f382-4cda-95cf-2445111a5096" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:47:41 compute-0 nova_compute[187223]: 2025-11-28 17:47:41.753 187227 DEBUG oslo_concurrency.lockutils [None req-a805da69-e124-43db-95fd-f5af4a6aa532 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Acquiring lock "9141f01f-f382-4cda-95cf-2445111a5096-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:47:41 compute-0 nova_compute[187223]: 2025-11-28 17:47:41.754 187227 DEBUG oslo_concurrency.lockutils [None req-a805da69-e124-43db-95fd-f5af4a6aa532 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "9141f01f-f382-4cda-95cf-2445111a5096-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:47:41 compute-0 nova_compute[187223]: 2025-11-28 17:47:41.754 187227 DEBUG oslo_concurrency.lockutils [None req-a805da69-e124-43db-95fd-f5af4a6aa532 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "9141f01f-f382-4cda-95cf-2445111a5096-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:47:41 compute-0 nova_compute[187223]: 2025-11-28 17:47:41.756 187227 INFO nova.compute.manager [None req-a805da69-e124-43db-95fd-f5af4a6aa532 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 9141f01f-f382-4cda-95cf-2445111a5096] Terminating instance
Nov 28 17:47:41 compute-0 nova_compute[187223]: 2025-11-28 17:47:41.758 187227 DEBUG nova.compute.manager [None req-a805da69-e124-43db-95fd-f5af4a6aa532 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 9141f01f-f382-4cda-95cf-2445111a5096] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 28 17:47:41 compute-0 kernel: tap0156a04b-7f (unregistering): left promiscuous mode
Nov 28 17:47:41 compute-0 NetworkManager[55763]: <info>  [1764352061.7865] device (tap0156a04b-7f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 28 17:47:41 compute-0 nova_compute[187223]: 2025-11-28 17:47:41.797 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:47:41 compute-0 ovn_controller[95574]: 2025-11-28T17:47:41Z|00135|binding|INFO|Releasing lport 0156a04b-7f6a-479f-a897-2eec2dd5b73a from this chassis (sb_readonly=0)
Nov 28 17:47:41 compute-0 ovn_controller[95574]: 2025-11-28T17:47:41Z|00136|binding|INFO|Setting lport 0156a04b-7f6a-479f-a897-2eec2dd5b73a down in Southbound
Nov 28 17:47:41 compute-0 ovn_controller[95574]: 2025-11-28T17:47:41Z|00137|binding|INFO|Removing iface tap0156a04b-7f ovn-installed in OVS
Nov 28 17:47:41 compute-0 nova_compute[187223]: 2025-11-28 17:47:41.803 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:47:41 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:47:41.810 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:46:5a:ff 10.100.0.10'], port_security=['fa:16:3e:46:5a:ff 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '9141f01f-f382-4cda-95cf-2445111a5096', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f987f40adf1f46018ab0ca81b8d954f6', 'neutron:revision_number': '13', 'neutron:security_group_ids': '2454196c-3f33-40c4-b65e-ff5ce51cee25', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8b91a17c-2725-4cda-baa8-a6987e8e4bba, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], logical_port=0156a04b-7f6a-479f-a897-2eec2dd5b73a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 17:47:41 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:47:41.812 104433 INFO neutron.agent.ovn.metadata.agent [-] Port 0156a04b-7f6a-479f-a897-2eec2dd5b73a in datapath 7710a7d0-31b3-4473-89c4-40533fdd6e7d unbound from our chassis
Nov 28 17:47:41 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:47:41.814 104433 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7710a7d0-31b3-4473-89c4-40533fdd6e7d
Nov 28 17:47:41 compute-0 nova_compute[187223]: 2025-11-28 17:47:41.817 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:47:41 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:47:41.843 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[86b08bd6-cee7-4ea8-985e-e3fe33d99dad]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:47:41 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000010.scope: Deactivated successfully.
Nov 28 17:47:41 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000010.scope: Consumed 2.060s CPU time.
Nov 28 17:47:41 compute-0 systemd-machined[153517]: Machine qemu-12-instance-00000010 terminated.
Nov 28 17:47:41 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:47:41.882 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[729bdc7d-bb3b-42df-b79a-06e56bd911ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:47:41 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:47:41.886 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[f0178fe9-dc8c-4d7a-8ca5-0b638ab8067d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:47:41 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:47:41.927 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[b77f41e6-0cc9-4582-946a-9013add409e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:47:41 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:47:41.957 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[d1cc9947-f7c0-46b1-b50e-24f21c1658d6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7710a7d0-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7e:b9:9f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 37], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 515572, 'reachable_time': 22526, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214699, 'error': None, 'target': 'ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:47:41 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:47:41.976 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[03896ef7-3ddb-4833-82c3-169bbba57eb0]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap7710a7d0-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 515585, 'tstamp': 515585}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214700, 'error': None, 'target': 'ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap7710a7d0-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 515589, 'tstamp': 515589}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214700, 'error': None, 'target': 'ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:47:41 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:47:41.978 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7710a7d0-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:47:41 compute-0 nova_compute[187223]: 2025-11-28 17:47:41.980 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:47:41 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:47:41.990 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7710a7d0-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:47:41 compute-0 nova_compute[187223]: 2025-11-28 17:47:41.989 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:47:41 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:47:41.990 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 17:47:41 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:47:41.990 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7710a7d0-30, col_values=(('external_ids', {'iface-id': 'bc789832-2d4b-4b14-95c2-e30a740a3a6b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:47:41 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:47:41.991 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 17:47:41 compute-0 nova_compute[187223]: 2025-11-28 17:47:41.997 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:47:42 compute-0 nova_compute[187223]: 2025-11-28 17:47:42.026 187227 DEBUG nova.compute.manager [req-0b23ed38-e90e-4b3c-8e4a-a6e0ff2f9083 req-9055c1a6-39b6-48fa-ac53-3f8daa49bc70 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 9141f01f-f382-4cda-95cf-2445111a5096] Received event network-vif-unplugged-0156a04b-7f6a-479f-a897-2eec2dd5b73a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:47:42 compute-0 nova_compute[187223]: 2025-11-28 17:47:42.027 187227 DEBUG oslo_concurrency.lockutils [req-0b23ed38-e90e-4b3c-8e4a-a6e0ff2f9083 req-9055c1a6-39b6-48fa-ac53-3f8daa49bc70 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "9141f01f-f382-4cda-95cf-2445111a5096-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:47:42 compute-0 nova_compute[187223]: 2025-11-28 17:47:42.027 187227 DEBUG oslo_concurrency.lockutils [req-0b23ed38-e90e-4b3c-8e4a-a6e0ff2f9083 req-9055c1a6-39b6-48fa-ac53-3f8daa49bc70 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "9141f01f-f382-4cda-95cf-2445111a5096-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:47:42 compute-0 nova_compute[187223]: 2025-11-28 17:47:42.027 187227 DEBUG oslo_concurrency.lockutils [req-0b23ed38-e90e-4b3c-8e4a-a6e0ff2f9083 req-9055c1a6-39b6-48fa-ac53-3f8daa49bc70 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "9141f01f-f382-4cda-95cf-2445111a5096-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:47:42 compute-0 nova_compute[187223]: 2025-11-28 17:47:42.028 187227 DEBUG nova.compute.manager [req-0b23ed38-e90e-4b3c-8e4a-a6e0ff2f9083 req-9055c1a6-39b6-48fa-ac53-3f8daa49bc70 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 9141f01f-f382-4cda-95cf-2445111a5096] No waiting events found dispatching network-vif-unplugged-0156a04b-7f6a-479f-a897-2eec2dd5b73a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:47:42 compute-0 nova_compute[187223]: 2025-11-28 17:47:42.028 187227 DEBUG nova.compute.manager [req-0b23ed38-e90e-4b3c-8e4a-a6e0ff2f9083 req-9055c1a6-39b6-48fa-ac53-3f8daa49bc70 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 9141f01f-f382-4cda-95cf-2445111a5096] Received event network-vif-unplugged-0156a04b-7f6a-479f-a897-2eec2dd5b73a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 28 17:47:42 compute-0 nova_compute[187223]: 2025-11-28 17:47:42.052 187227 INFO nova.virt.libvirt.driver [-] [instance: 9141f01f-f382-4cda-95cf-2445111a5096] Instance destroyed successfully.
Nov 28 17:47:42 compute-0 nova_compute[187223]: 2025-11-28 17:47:42.052 187227 DEBUG nova.objects.instance [None req-a805da69-e124-43db-95fd-f5af4a6aa532 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lazy-loading 'resources' on Instance uuid 9141f01f-f382-4cda-95cf-2445111a5096 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:47:42 compute-0 nova_compute[187223]: 2025-11-28 17:47:42.080 187227 DEBUG nova.virt.libvirt.vif [None req-a805da69-e124-43db-95fd-f5af4a6aa532 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-28T17:46:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-217199433',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-217199433',id=16,image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-28T17:46:34Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f987f40adf1f46018ab0ca81b8d954f6',ramdisk_id='',reservation_id='r-0pbro1un',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',o
wner_project_name='tempest-TestExecuteStrategies-384316604',owner_user_name='tempest-TestExecuteStrategies-384316604-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-28T17:47:36Z,user_data=None,user_id='40bca16232f3471c8094a414f8874e9a',uuid=9141f01f-f382-4cda-95cf-2445111a5096,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0156a04b-7f6a-479f-a897-2eec2dd5b73a", "address": "fa:16:3e:46:5a:ff", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0156a04b-7f", "ovs_interfaceid": "0156a04b-7f6a-479f-a897-2eec2dd5b73a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 28 17:47:42 compute-0 nova_compute[187223]: 2025-11-28 17:47:42.080 187227 DEBUG nova.network.os_vif_util [None req-a805da69-e124-43db-95fd-f5af4a6aa532 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Converting VIF {"id": "0156a04b-7f6a-479f-a897-2eec2dd5b73a", "address": "fa:16:3e:46:5a:ff", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0156a04b-7f", "ovs_interfaceid": "0156a04b-7f6a-479f-a897-2eec2dd5b73a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 17:47:42 compute-0 nova_compute[187223]: 2025-11-28 17:47:42.081 187227 DEBUG nova.network.os_vif_util [None req-a805da69-e124-43db-95fd-f5af4a6aa532 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:46:5a:ff,bridge_name='br-int',has_traffic_filtering=True,id=0156a04b-7f6a-479f-a897-2eec2dd5b73a,network=Network(7710a7d0-31b3-4473-89c4-40533fdd6e7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0156a04b-7f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 17:47:42 compute-0 nova_compute[187223]: 2025-11-28 17:47:42.082 187227 DEBUG os_vif [None req-a805da69-e124-43db-95fd-f5af4a6aa532 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:46:5a:ff,bridge_name='br-int',has_traffic_filtering=True,id=0156a04b-7f6a-479f-a897-2eec2dd5b73a,network=Network(7710a7d0-31b3-4473-89c4-40533fdd6e7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0156a04b-7f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 28 17:47:42 compute-0 nova_compute[187223]: 2025-11-28 17:47:42.085 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:47:42 compute-0 nova_compute[187223]: 2025-11-28 17:47:42.085 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0156a04b-7f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:47:42 compute-0 nova_compute[187223]: 2025-11-28 17:47:42.088 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:47:42 compute-0 nova_compute[187223]: 2025-11-28 17:47:42.090 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:47:42 compute-0 nova_compute[187223]: 2025-11-28 17:47:42.094 187227 INFO os_vif [None req-a805da69-e124-43db-95fd-f5af4a6aa532 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:46:5a:ff,bridge_name='br-int',has_traffic_filtering=True,id=0156a04b-7f6a-479f-a897-2eec2dd5b73a,network=Network(7710a7d0-31b3-4473-89c4-40533fdd6e7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0156a04b-7f')
Nov 28 17:47:42 compute-0 nova_compute[187223]: 2025-11-28 17:47:42.095 187227 INFO nova.virt.libvirt.driver [None req-a805da69-e124-43db-95fd-f5af4a6aa532 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 9141f01f-f382-4cda-95cf-2445111a5096] Deleting instance files /var/lib/nova/instances/9141f01f-f382-4cda-95cf-2445111a5096_del
Nov 28 17:47:42 compute-0 nova_compute[187223]: 2025-11-28 17:47:42.096 187227 INFO nova.virt.libvirt.driver [None req-a805da69-e124-43db-95fd-f5af4a6aa532 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 9141f01f-f382-4cda-95cf-2445111a5096] Deletion of /var/lib/nova/instances/9141f01f-f382-4cda-95cf-2445111a5096_del complete
Nov 28 17:47:42 compute-0 nova_compute[187223]: 2025-11-28 17:47:42.181 187227 INFO nova.compute.manager [None req-a805da69-e124-43db-95fd-f5af4a6aa532 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 9141f01f-f382-4cda-95cf-2445111a5096] Took 0.42 seconds to destroy the instance on the hypervisor.
Nov 28 17:47:42 compute-0 nova_compute[187223]: 2025-11-28 17:47:42.182 187227 DEBUG oslo.service.loopingcall [None req-a805da69-e124-43db-95fd-f5af4a6aa532 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 28 17:47:42 compute-0 nova_compute[187223]: 2025-11-28 17:47:42.182 187227 DEBUG nova.compute.manager [-] [instance: 9141f01f-f382-4cda-95cf-2445111a5096] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 28 17:47:42 compute-0 nova_compute[187223]: 2025-11-28 17:47:42.183 187227 DEBUG nova.network.neutron [-] [instance: 9141f01f-f382-4cda-95cf-2445111a5096] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 28 17:47:42 compute-0 nova_compute[187223]: 2025-11-28 17:47:42.873 187227 DEBUG nova.network.neutron [-] [instance: 9141f01f-f382-4cda-95cf-2445111a5096] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:47:43 compute-0 nova_compute[187223]: 2025-11-28 17:47:43.093 187227 INFO nova.compute.manager [-] [instance: 9141f01f-f382-4cda-95cf-2445111a5096] Took 0.91 seconds to deallocate network for instance.
Nov 28 17:47:43 compute-0 nova_compute[187223]: 2025-11-28 17:47:43.173 187227 DEBUG oslo_concurrency.lockutils [None req-a805da69-e124-43db-95fd-f5af4a6aa532 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:47:43 compute-0 nova_compute[187223]: 2025-11-28 17:47:43.173 187227 DEBUG oslo_concurrency.lockutils [None req-a805da69-e124-43db-95fd-f5af4a6aa532 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:47:43 compute-0 nova_compute[187223]: 2025-11-28 17:47:43.254 187227 DEBUG nova.compute.provider_tree [None req-a805da69-e124-43db-95fd-f5af4a6aa532 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 17:47:43 compute-0 nova_compute[187223]: 2025-11-28 17:47:43.271 187227 DEBUG nova.scheduler.client.report [None req-a805da69-e124-43db-95fd-f5af4a6aa532 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 17:47:43 compute-0 nova_compute[187223]: 2025-11-28 17:47:43.295 187227 DEBUG oslo_concurrency.lockutils [None req-a805da69-e124-43db-95fd-f5af4a6aa532 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:47:43 compute-0 nova_compute[187223]: 2025-11-28 17:47:43.323 187227 INFO nova.scheduler.client.report [None req-a805da69-e124-43db-95fd-f5af4a6aa532 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Deleted allocations for instance 9141f01f-f382-4cda-95cf-2445111a5096
Nov 28 17:47:43 compute-0 nova_compute[187223]: 2025-11-28 17:47:43.379 187227 DEBUG oslo_concurrency.lockutils [None req-a805da69-e124-43db-95fd-f5af4a6aa532 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "9141f01f-f382-4cda-95cf-2445111a5096" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.626s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:47:43 compute-0 nova_compute[187223]: 2025-11-28 17:47:43.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:47:43 compute-0 nova_compute[187223]: 2025-11-28 17:47:43.685 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 17:47:43 compute-0 nova_compute[187223]: 2025-11-28 17:47:43.685 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 17:47:43 compute-0 nova_compute[187223]: 2025-11-28 17:47:43.938 187227 DEBUG oslo_concurrency.lockutils [None req-ddb9b060-43b7-4bee-918e-15649eb76bf2 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Acquiring lock "3a554ced-b6a5-4a0f-b573-c1e3c6cf8382" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:47:43 compute-0 nova_compute[187223]: 2025-11-28 17:47:43.939 187227 DEBUG oslo_concurrency.lockutils [None req-ddb9b060-43b7-4bee-918e-15649eb76bf2 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "3a554ced-b6a5-4a0f-b573-c1e3c6cf8382" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:47:43 compute-0 nova_compute[187223]: 2025-11-28 17:47:43.939 187227 DEBUG oslo_concurrency.lockutils [None req-ddb9b060-43b7-4bee-918e-15649eb76bf2 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Acquiring lock "3a554ced-b6a5-4a0f-b573-c1e3c6cf8382-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:47:43 compute-0 nova_compute[187223]: 2025-11-28 17:47:43.940 187227 DEBUG oslo_concurrency.lockutils [None req-ddb9b060-43b7-4bee-918e-15649eb76bf2 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "3a554ced-b6a5-4a0f-b573-c1e3c6cf8382-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:47:43 compute-0 nova_compute[187223]: 2025-11-28 17:47:43.940 187227 DEBUG oslo_concurrency.lockutils [None req-ddb9b060-43b7-4bee-918e-15649eb76bf2 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "3a554ced-b6a5-4a0f-b573-c1e3c6cf8382-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:47:43 compute-0 nova_compute[187223]: 2025-11-28 17:47:43.941 187227 INFO nova.compute.manager [None req-ddb9b060-43b7-4bee-918e-15649eb76bf2 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] Terminating instance
Nov 28 17:47:43 compute-0 nova_compute[187223]: 2025-11-28 17:47:43.942 187227 DEBUG nova.compute.manager [None req-ddb9b060-43b7-4bee-918e-15649eb76bf2 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 28 17:47:43 compute-0 nova_compute[187223]: 2025-11-28 17:47:43.951 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "refresh_cache-3a554ced-b6a5-4a0f-b573-c1e3c6cf8382" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 17:47:43 compute-0 nova_compute[187223]: 2025-11-28 17:47:43.951 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquired lock "refresh_cache-3a554ced-b6a5-4a0f-b573-c1e3c6cf8382" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 17:47:43 compute-0 nova_compute[187223]: 2025-11-28 17:47:43.951 187227 DEBUG nova.network.neutron [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 28 17:47:43 compute-0 nova_compute[187223]: 2025-11-28 17:47:43.951 187227 DEBUG nova.objects.instance [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:47:43 compute-0 kernel: tapd5f25ad5-b6 (unregistering): left promiscuous mode
Nov 28 17:47:43 compute-0 NetworkManager[55763]: <info>  [1764352063.9715] device (tapd5f25ad5-b6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 28 17:47:43 compute-0 ovn_controller[95574]: 2025-11-28T17:47:43Z|00138|binding|INFO|Releasing lport d5f25ad5-b616-4614-9ff6-76e2a057dc48 from this chassis (sb_readonly=0)
Nov 28 17:47:43 compute-0 nova_compute[187223]: 2025-11-28 17:47:43.980 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:47:43 compute-0 ovn_controller[95574]: 2025-11-28T17:47:43Z|00139|binding|INFO|Setting lport d5f25ad5-b616-4614-9ff6-76e2a057dc48 down in Southbound
Nov 28 17:47:43 compute-0 ovn_controller[95574]: 2025-11-28T17:47:43Z|00140|binding|INFO|Removing iface tapd5f25ad5-b6 ovn-installed in OVS
Nov 28 17:47:43 compute-0 nova_compute[187223]: 2025-11-28 17:47:43.985 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:47:43 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:47:43.989 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:11:ea:77 10.100.0.12'], port_security=['fa:16:3e:11:ea:77 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '3a554ced-b6a5-4a0f-b573-c1e3c6cf8382', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f987f40adf1f46018ab0ca81b8d954f6', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2454196c-3f33-40c4-b65e-ff5ce51cee25', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8b91a17c-2725-4cda-baa8-a6987e8e4bba, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], logical_port=d5f25ad5-b616-4614-9ff6-76e2a057dc48) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 17:47:43 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:47:43.990 104433 INFO neutron.agent.ovn.metadata.agent [-] Port d5f25ad5-b616-4614-9ff6-76e2a057dc48 in datapath 7710a7d0-31b3-4473-89c4-40533fdd6e7d unbound from our chassis
Nov 28 17:47:43 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:47:43.991 104433 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7710a7d0-31b3-4473-89c4-40533fdd6e7d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 17:47:43 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:47:43.993 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[9bf975c9-d8d6-4525-9264-fca5dda56021]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:47:43 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:47:43.993 104433 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d namespace which is not needed anymore
Nov 28 17:47:44 compute-0 nova_compute[187223]: 2025-11-28 17:47:44.008 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:47:44 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000f.scope: Deactivated successfully.
Nov 28 17:47:44 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000f.scope: Consumed 16.514s CPU time.
Nov 28 17:47:44 compute-0 systemd-machined[153517]: Machine qemu-11-instance-0000000f terminated.
Nov 28 17:47:44 compute-0 nova_compute[187223]: 2025-11-28 17:47:44.126 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:47:44 compute-0 neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d[214258]: [NOTICE]   (214262) : haproxy version is 2.8.14-c23fe91
Nov 28 17:47:44 compute-0 neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d[214258]: [NOTICE]   (214262) : path to executable is /usr/sbin/haproxy
Nov 28 17:47:44 compute-0 neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d[214258]: [WARNING]  (214262) : Exiting Master process...
Nov 28 17:47:44 compute-0 neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d[214258]: [WARNING]  (214262) : Exiting Master process...
Nov 28 17:47:44 compute-0 neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d[214258]: [ALERT]    (214262) : Current worker (214264) exited with code 143 (Terminated)
Nov 28 17:47:44 compute-0 neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d[214258]: [WARNING]  (214262) : All workers exited. Exiting... (0)
Nov 28 17:47:44 compute-0 NetworkManager[55763]: <info>  [1764352064.1684] manager: (tapd5f25ad5-b6): new Tun device (/org/freedesktop/NetworkManager/Devices/58)
Nov 28 17:47:44 compute-0 systemd[1]: libpod-c24a28bde37c67e1e9638390dcc8fb6d8ba366cfcad98fde6d5d0ac95645b368.scope: Deactivated successfully.
Nov 28 17:47:44 compute-0 nova_compute[187223]: 2025-11-28 17:47:44.170 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:47:44 compute-0 podman[214740]: 2025-11-28 17:47:44.172388639 +0000 UTC m=+0.051055442 container died c24a28bde37c67e1e9638390dcc8fb6d8ba366cfcad98fde6d5d0ac95645b368 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 28 17:47:44 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c24a28bde37c67e1e9638390dcc8fb6d8ba366cfcad98fde6d5d0ac95645b368-userdata-shm.mount: Deactivated successfully.
Nov 28 17:47:44 compute-0 nova_compute[187223]: 2025-11-28 17:47:44.211 187227 INFO nova.virt.libvirt.driver [-] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] Instance destroyed successfully.
Nov 28 17:47:44 compute-0 nova_compute[187223]: 2025-11-28 17:47:44.212 187227 DEBUG nova.objects.instance [None req-ddb9b060-43b7-4bee-918e-15649eb76bf2 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lazy-loading 'resources' on Instance uuid 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:47:44 compute-0 systemd[1]: var-lib-containers-storage-overlay-85e098c3d50641adcbf1f9a8f6c7fdc3a6d028f86ca4c918193534da870b6f39-merged.mount: Deactivated successfully.
Nov 28 17:47:44 compute-0 podman[214740]: 2025-11-28 17:47:44.224843037 +0000 UTC m=+0.103509860 container cleanup c24a28bde37c67e1e9638390dcc8fb6d8ba366cfcad98fde6d5d0ac95645b368 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Nov 28 17:47:44 compute-0 systemd[1]: libpod-conmon-c24a28bde37c67e1e9638390dcc8fb6d8ba366cfcad98fde6d5d0ac95645b368.scope: Deactivated successfully.
Nov 28 17:47:44 compute-0 nova_compute[187223]: 2025-11-28 17:47:44.283 187227 DEBUG nova.virt.libvirt.vif [None req-ddb9b060-43b7-4bee-918e-15649eb76bf2 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T17:46:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1838714708',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1838714708',id=15,image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-28T17:46:21Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f987f40adf1f46018ab0ca81b8d954f6',ramdisk_id='',reservation_id='r-qlm8oug8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-384316604',owner_user_name='tempest-TestExecuteStrategies-384316604-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-28T17:46:21Z,user_data=None,user_id='40bca16232f3471c8094a414f8874e9a',uuid=3a554ced-b6a5-4a0f-b573-c1e3c6cf8382,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d5f25ad5-b616-4614-9ff6-76e2a057dc48", "address": "fa:16:3e:11:ea:77", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd5f25ad5-b6", "ovs_interfaceid": "d5f25ad5-b616-4614-9ff6-76e2a057dc48", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 28 17:47:44 compute-0 nova_compute[187223]: 2025-11-28 17:47:44.283 187227 DEBUG nova.network.os_vif_util [None req-ddb9b060-43b7-4bee-918e-15649eb76bf2 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Converting VIF {"id": "d5f25ad5-b616-4614-9ff6-76e2a057dc48", "address": "fa:16:3e:11:ea:77", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd5f25ad5-b6", "ovs_interfaceid": "d5f25ad5-b616-4614-9ff6-76e2a057dc48", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 17:47:44 compute-0 nova_compute[187223]: 2025-11-28 17:47:44.284 187227 DEBUG nova.network.os_vif_util [None req-ddb9b060-43b7-4bee-918e-15649eb76bf2 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:11:ea:77,bridge_name='br-int',has_traffic_filtering=True,id=d5f25ad5-b616-4614-9ff6-76e2a057dc48,network=Network(7710a7d0-31b3-4473-89c4-40533fdd6e7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd5f25ad5-b6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 17:47:44 compute-0 nova_compute[187223]: 2025-11-28 17:47:44.284 187227 DEBUG os_vif [None req-ddb9b060-43b7-4bee-918e-15649eb76bf2 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:11:ea:77,bridge_name='br-int',has_traffic_filtering=True,id=d5f25ad5-b616-4614-9ff6-76e2a057dc48,network=Network(7710a7d0-31b3-4473-89c4-40533fdd6e7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd5f25ad5-b6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 28 17:47:44 compute-0 nova_compute[187223]: 2025-11-28 17:47:44.286 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:47:44 compute-0 nova_compute[187223]: 2025-11-28 17:47:44.286 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd5f25ad5-b6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:47:44 compute-0 nova_compute[187223]: 2025-11-28 17:47:44.288 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:47:44 compute-0 nova_compute[187223]: 2025-11-28 17:47:44.290 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 17:47:44 compute-0 nova_compute[187223]: 2025-11-28 17:47:44.292 187227 INFO os_vif [None req-ddb9b060-43b7-4bee-918e-15649eb76bf2 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:11:ea:77,bridge_name='br-int',has_traffic_filtering=True,id=d5f25ad5-b616-4614-9ff6-76e2a057dc48,network=Network(7710a7d0-31b3-4473-89c4-40533fdd6e7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd5f25ad5-b6')
Nov 28 17:47:44 compute-0 nova_compute[187223]: 2025-11-28 17:47:44.293 187227 INFO nova.virt.libvirt.driver [None req-ddb9b060-43b7-4bee-918e-15649eb76bf2 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] Deleting instance files /var/lib/nova/instances/3a554ced-b6a5-4a0f-b573-c1e3c6cf8382_del
Nov 28 17:47:44 compute-0 nova_compute[187223]: 2025-11-28 17:47:44.294 187227 INFO nova.virt.libvirt.driver [None req-ddb9b060-43b7-4bee-918e-15649eb76bf2 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] Deletion of /var/lib/nova/instances/3a554ced-b6a5-4a0f-b573-c1e3c6cf8382_del complete
Nov 28 17:47:44 compute-0 podman[214785]: 2025-11-28 17:47:44.29797766 +0000 UTC m=+0.045477028 container remove c24a28bde37c67e1e9638390dcc8fb6d8ba366cfcad98fde6d5d0ac95645b368 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 28 17:47:44 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:47:44.302 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[dd38b404-d685-48dc-8199-1eb9f017952e]: (4, ('Fri Nov 28 05:47:44 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d (c24a28bde37c67e1e9638390dcc8fb6d8ba366cfcad98fde6d5d0ac95645b368)\nc24a28bde37c67e1e9638390dcc8fb6d8ba366cfcad98fde6d5d0ac95645b368\nFri Nov 28 05:47:44 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d (c24a28bde37c67e1e9638390dcc8fb6d8ba366cfcad98fde6d5d0ac95645b368)\nc24a28bde37c67e1e9638390dcc8fb6d8ba366cfcad98fde6d5d0ac95645b368\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:47:44 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:47:44.304 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[d4779b4c-2f07-44e3-853a-773817c5a067]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:47:44 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:47:44.305 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7710a7d0-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:47:44 compute-0 nova_compute[187223]: 2025-11-28 17:47:44.306 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:47:44 compute-0 kernel: tap7710a7d0-30: left promiscuous mode
Nov 28 17:47:44 compute-0 nova_compute[187223]: 2025-11-28 17:47:44.319 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:47:44 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:47:44.322 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[1f34ae4f-f4d0-4c1c-aad1-e299c6a95929]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:47:44 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:47:44.342 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[4bd9ce4a-fda3-4434-a34a-01e238aa84e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:47:44 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:47:44.344 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[c67924be-321b-45b2-8d6b-288a83b2c6ba]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:47:44 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:47:44.363 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[46466088-957f-4726-b28d-81774b803de4]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 515564, 'reachable_time': 17884, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214800, 'error': None, 'target': 'ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:47:44 compute-0 systemd[1]: run-netns-ovnmeta\x2d7710a7d0\x2d31b3\x2d4473\x2d89c4\x2d40533fdd6e7d.mount: Deactivated successfully.
Nov 28 17:47:44 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:47:44.368 104546 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 28 17:47:44 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:47:44.369 104546 DEBUG oslo.privsep.daemon [-] privsep: reply[44f225ae-7355-43eb-b0d3-cf4b1942e34b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:47:44 compute-0 nova_compute[187223]: 2025-11-28 17:47:44.543 187227 INFO nova.compute.manager [None req-ddb9b060-43b7-4bee-918e-15649eb76bf2 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] Took 0.60 seconds to destroy the instance on the hypervisor.
Nov 28 17:47:44 compute-0 nova_compute[187223]: 2025-11-28 17:47:44.544 187227 DEBUG oslo.service.loopingcall [None req-ddb9b060-43b7-4bee-918e-15649eb76bf2 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 28 17:47:44 compute-0 nova_compute[187223]: 2025-11-28 17:47:44.545 187227 DEBUG nova.compute.manager [-] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 28 17:47:44 compute-0 nova_compute[187223]: 2025-11-28 17:47:44.545 187227 DEBUG nova.network.neutron [-] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 28 17:47:44 compute-0 nova_compute[187223]: 2025-11-28 17:47:44.568 187227 DEBUG nova.compute.manager [req-2a97ab42-600e-411d-a82a-4e7f6db52c41 req-e3b9b7ac-e030-4d93-a85a-528e60ee84f7 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 9141f01f-f382-4cda-95cf-2445111a5096] Received event network-vif-plugged-0156a04b-7f6a-479f-a897-2eec2dd5b73a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:47:44 compute-0 nova_compute[187223]: 2025-11-28 17:47:44.568 187227 DEBUG oslo_concurrency.lockutils [req-2a97ab42-600e-411d-a82a-4e7f6db52c41 req-e3b9b7ac-e030-4d93-a85a-528e60ee84f7 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "9141f01f-f382-4cda-95cf-2445111a5096-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:47:44 compute-0 nova_compute[187223]: 2025-11-28 17:47:44.569 187227 DEBUG oslo_concurrency.lockutils [req-2a97ab42-600e-411d-a82a-4e7f6db52c41 req-e3b9b7ac-e030-4d93-a85a-528e60ee84f7 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "9141f01f-f382-4cda-95cf-2445111a5096-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:47:44 compute-0 nova_compute[187223]: 2025-11-28 17:47:44.569 187227 DEBUG oslo_concurrency.lockutils [req-2a97ab42-600e-411d-a82a-4e7f6db52c41 req-e3b9b7ac-e030-4d93-a85a-528e60ee84f7 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "9141f01f-f382-4cda-95cf-2445111a5096-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:47:44 compute-0 nova_compute[187223]: 2025-11-28 17:47:44.569 187227 DEBUG nova.compute.manager [req-2a97ab42-600e-411d-a82a-4e7f6db52c41 req-e3b9b7ac-e030-4d93-a85a-528e60ee84f7 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 9141f01f-f382-4cda-95cf-2445111a5096] No waiting events found dispatching network-vif-plugged-0156a04b-7f6a-479f-a897-2eec2dd5b73a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:47:44 compute-0 nova_compute[187223]: 2025-11-28 17:47:44.569 187227 WARNING nova.compute.manager [req-2a97ab42-600e-411d-a82a-4e7f6db52c41 req-e3b9b7ac-e030-4d93-a85a-528e60ee84f7 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 9141f01f-f382-4cda-95cf-2445111a5096] Received unexpected event network-vif-plugged-0156a04b-7f6a-479f-a897-2eec2dd5b73a for instance with vm_state deleted and task_state None.
Nov 28 17:47:44 compute-0 nova_compute[187223]: 2025-11-28 17:47:44.570 187227 DEBUG nova.compute.manager [req-2a97ab42-600e-411d-a82a-4e7f6db52c41 req-e3b9b7ac-e030-4d93-a85a-528e60ee84f7 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 9141f01f-f382-4cda-95cf-2445111a5096] Received event network-vif-deleted-0156a04b-7f6a-479f-a897-2eec2dd5b73a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:47:45 compute-0 nova_compute[187223]: 2025-11-28 17:47:45.853 187227 DEBUG nova.network.neutron [-] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:47:45 compute-0 nova_compute[187223]: 2025-11-28 17:47:45.870 187227 INFO nova.compute.manager [-] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] Took 1.33 seconds to deallocate network for instance.
Nov 28 17:47:45 compute-0 nova_compute[187223]: 2025-11-28 17:47:45.911 187227 DEBUG oslo_concurrency.lockutils [None req-ddb9b060-43b7-4bee-918e-15649eb76bf2 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:47:45 compute-0 nova_compute[187223]: 2025-11-28 17:47:45.911 187227 DEBUG oslo_concurrency.lockutils [None req-ddb9b060-43b7-4bee-918e-15649eb76bf2 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:47:45 compute-0 nova_compute[187223]: 2025-11-28 17:47:45.961 187227 DEBUG nova.compute.provider_tree [None req-ddb9b060-43b7-4bee-918e-15649eb76bf2 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 17:47:45 compute-0 nova_compute[187223]: 2025-11-28 17:47:45.977 187227 DEBUG nova.scheduler.client.report [None req-ddb9b060-43b7-4bee-918e-15649eb76bf2 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 17:47:45 compute-0 nova_compute[187223]: 2025-11-28 17:47:45.997 187227 DEBUG oslo_concurrency.lockutils [None req-ddb9b060-43b7-4bee-918e-15649eb76bf2 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.086s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:47:46 compute-0 nova_compute[187223]: 2025-11-28 17:47:46.028 187227 INFO nova.scheduler.client.report [None req-ddb9b060-43b7-4bee-918e-15649eb76bf2 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Deleted allocations for instance 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382
Nov 28 17:47:46 compute-0 nova_compute[187223]: 2025-11-28 17:47:46.116 187227 DEBUG oslo_concurrency.lockutils [None req-ddb9b060-43b7-4bee-918e-15649eb76bf2 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "3a554ced-b6a5-4a0f-b573-c1e3c6cf8382" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.177s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:47:46 compute-0 nova_compute[187223]: 2025-11-28 17:47:46.676 187227 DEBUG nova.compute.manager [req-8fc5d2e1-690e-4650-87c3-424297690b9e req-caffe61e-ea83-42f9-b6b5-e1f231facaec 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] Received event network-vif-unplugged-d5f25ad5-b616-4614-9ff6-76e2a057dc48 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:47:46 compute-0 nova_compute[187223]: 2025-11-28 17:47:46.677 187227 DEBUG oslo_concurrency.lockutils [req-8fc5d2e1-690e-4650-87c3-424297690b9e req-caffe61e-ea83-42f9-b6b5-e1f231facaec 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "3a554ced-b6a5-4a0f-b573-c1e3c6cf8382-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:47:46 compute-0 nova_compute[187223]: 2025-11-28 17:47:46.678 187227 DEBUG oslo_concurrency.lockutils [req-8fc5d2e1-690e-4650-87c3-424297690b9e req-caffe61e-ea83-42f9-b6b5-e1f231facaec 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "3a554ced-b6a5-4a0f-b573-c1e3c6cf8382-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:47:46 compute-0 nova_compute[187223]: 2025-11-28 17:47:46.678 187227 DEBUG oslo_concurrency.lockutils [req-8fc5d2e1-690e-4650-87c3-424297690b9e req-caffe61e-ea83-42f9-b6b5-e1f231facaec 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "3a554ced-b6a5-4a0f-b573-c1e3c6cf8382-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:47:46 compute-0 nova_compute[187223]: 2025-11-28 17:47:46.679 187227 DEBUG nova.compute.manager [req-8fc5d2e1-690e-4650-87c3-424297690b9e req-caffe61e-ea83-42f9-b6b5-e1f231facaec 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] No waiting events found dispatching network-vif-unplugged-d5f25ad5-b616-4614-9ff6-76e2a057dc48 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:47:46 compute-0 nova_compute[187223]: 2025-11-28 17:47:46.679 187227 WARNING nova.compute.manager [req-8fc5d2e1-690e-4650-87c3-424297690b9e req-caffe61e-ea83-42f9-b6b5-e1f231facaec 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] Received unexpected event network-vif-unplugged-d5f25ad5-b616-4614-9ff6-76e2a057dc48 for instance with vm_state deleted and task_state None.
Nov 28 17:47:46 compute-0 nova_compute[187223]: 2025-11-28 17:47:46.679 187227 DEBUG nova.compute.manager [req-8fc5d2e1-690e-4650-87c3-424297690b9e req-caffe61e-ea83-42f9-b6b5-e1f231facaec 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] Received event network-vif-plugged-d5f25ad5-b616-4614-9ff6-76e2a057dc48 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:47:46 compute-0 nova_compute[187223]: 2025-11-28 17:47:46.680 187227 DEBUG oslo_concurrency.lockutils [req-8fc5d2e1-690e-4650-87c3-424297690b9e req-caffe61e-ea83-42f9-b6b5-e1f231facaec 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "3a554ced-b6a5-4a0f-b573-c1e3c6cf8382-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:47:46 compute-0 nova_compute[187223]: 2025-11-28 17:47:46.680 187227 DEBUG oslo_concurrency.lockutils [req-8fc5d2e1-690e-4650-87c3-424297690b9e req-caffe61e-ea83-42f9-b6b5-e1f231facaec 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "3a554ced-b6a5-4a0f-b573-c1e3c6cf8382-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:47:46 compute-0 nova_compute[187223]: 2025-11-28 17:47:46.680 187227 DEBUG oslo_concurrency.lockutils [req-8fc5d2e1-690e-4650-87c3-424297690b9e req-caffe61e-ea83-42f9-b6b5-e1f231facaec 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "3a554ced-b6a5-4a0f-b573-c1e3c6cf8382-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:47:46 compute-0 nova_compute[187223]: 2025-11-28 17:47:46.681 187227 DEBUG nova.compute.manager [req-8fc5d2e1-690e-4650-87c3-424297690b9e req-caffe61e-ea83-42f9-b6b5-e1f231facaec 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] No waiting events found dispatching network-vif-plugged-d5f25ad5-b616-4614-9ff6-76e2a057dc48 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:47:46 compute-0 nova_compute[187223]: 2025-11-28 17:47:46.681 187227 WARNING nova.compute.manager [req-8fc5d2e1-690e-4650-87c3-424297690b9e req-caffe61e-ea83-42f9-b6b5-e1f231facaec 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] Received unexpected event network-vif-plugged-d5f25ad5-b616-4614-9ff6-76e2a057dc48 for instance with vm_state deleted and task_state None.
Nov 28 17:47:46 compute-0 nova_compute[187223]: 2025-11-28 17:47:46.682 187227 DEBUG nova.compute.manager [req-8fc5d2e1-690e-4650-87c3-424297690b9e req-caffe61e-ea83-42f9-b6b5-e1f231facaec 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] Received event network-vif-deleted-d5f25ad5-b616-4614-9ff6-76e2a057dc48 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:47:46 compute-0 nova_compute[187223]: 2025-11-28 17:47:46.716 187227 DEBUG nova.network.neutron [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] Updating instance_info_cache with network_info: [{"id": "d5f25ad5-b616-4614-9ff6-76e2a057dc48", "address": "fa:16:3e:11:ea:77", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd5f25ad5-b6", "ovs_interfaceid": "d5f25ad5-b616-4614-9ff6-76e2a057dc48", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:47:46 compute-0 nova_compute[187223]: 2025-11-28 17:47:46.741 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Releasing lock "refresh_cache-3a554ced-b6a5-4a0f-b573-c1e3c6cf8382" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 17:47:46 compute-0 nova_compute[187223]: 2025-11-28 17:47:46.741 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 28 17:47:46 compute-0 nova_compute[187223]: 2025-11-28 17:47:46.742 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:47:46 compute-0 nova_compute[187223]: 2025-11-28 17:47:46.742 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:47:47 compute-0 podman[214801]: 2025-11-28 17:47:47.223057988 +0000 UTC m=+0.082342562 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 17:47:47 compute-0 nova_compute[187223]: 2025-11-28 17:47:47.697 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:47:47 compute-0 nova_compute[187223]: 2025-11-28 17:47:47.698 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:47:49 compute-0 nova_compute[187223]: 2025-11-28 17:47:49.128 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:47:49 compute-0 nova_compute[187223]: 2025-11-28 17:47:49.288 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:47:53 compute-0 podman[214821]: 2025-11-28 17:47:53.199672454 +0000 UTC m=+0.058367342 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 28 17:47:53 compute-0 podman[214822]: 2025-11-28 17:47:53.25630509 +0000 UTC m=+0.117605845 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 17:47:53 compute-0 nova_compute[187223]: 2025-11-28 17:47:53.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:47:53 compute-0 nova_compute[187223]: 2025-11-28 17:47:53.685 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 28 17:47:54 compute-0 nova_compute[187223]: 2025-11-28 17:47:54.158 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:47:54 compute-0 nova_compute[187223]: 2025-11-28 17:47:54.289 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:47:56 compute-0 podman[214867]: 2025-11-28 17:47:56.245927257 +0000 UTC m=+0.096702854 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, version=9.6, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, 
io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vendor=Red Hat, Inc., distribution-scope=public)
Nov 28 17:47:57 compute-0 nova_compute[187223]: 2025-11-28 17:47:57.049 187227 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764352062.0474637, 9141f01f-f382-4cda-95cf-2445111a5096 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:47:57 compute-0 nova_compute[187223]: 2025-11-28 17:47:57.049 187227 INFO nova.compute.manager [-] [instance: 9141f01f-f382-4cda-95cf-2445111a5096] VM Stopped (Lifecycle Event)
Nov 28 17:47:57 compute-0 nova_compute[187223]: 2025-11-28 17:47:57.085 187227 DEBUG nova.compute.manager [None req-497afc45-9906-47c4-be41-a538a5ee89ea - - - - - -] [instance: 9141f01f-f382-4cda-95cf-2445111a5096] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:47:59 compute-0 nova_compute[187223]: 2025-11-28 17:47:59.197 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:47:59 compute-0 nova_compute[187223]: 2025-11-28 17:47:59.210 187227 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764352064.2090917, 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:47:59 compute-0 nova_compute[187223]: 2025-11-28 17:47:59.210 187227 INFO nova.compute.manager [-] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] VM Stopped (Lifecycle Event)
Nov 28 17:47:59 compute-0 nova_compute[187223]: 2025-11-28 17:47:59.264 187227 DEBUG nova.compute.manager [None req-7692e748-4b22-4641-a68a-818c8d864b05 - - - - - -] [instance: 3a554ced-b6a5-4a0f-b573-c1e3c6cf8382] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:47:59 compute-0 nova_compute[187223]: 2025-11-28 17:47:59.292 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:47:59 compute-0 podman[197556]: time="2025-11-28T17:47:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:47:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:47:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 28 17:47:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:47:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2587 "" "Go-http-client/1.1"
Nov 28 17:48:01 compute-0 openstack_network_exporter[199717]: ERROR   17:48:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:48:01 compute-0 openstack_network_exporter[199717]: ERROR   17:48:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:48:01 compute-0 openstack_network_exporter[199717]: ERROR   17:48:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:48:01 compute-0 openstack_network_exporter[199717]: ERROR   17:48:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:48:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:48:01 compute-0 openstack_network_exporter[199717]: ERROR   17:48:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:48:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:48:04 compute-0 nova_compute[187223]: 2025-11-28 17:48:04.199 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:48:04 compute-0 nova_compute[187223]: 2025-11-28 17:48:04.295 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:48:09 compute-0 nova_compute[187223]: 2025-11-28 17:48:09.201 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:48:09 compute-0 nova_compute[187223]: 2025-11-28 17:48:09.297 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:48:10 compute-0 podman[214888]: 2025-11-28 17:48:10.202936409 +0000 UTC m=+0.054287376 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 17:48:14 compute-0 nova_compute[187223]: 2025-11-28 17:48:14.202 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:48:14 compute-0 nova_compute[187223]: 2025-11-28 17:48:14.298 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:48:18 compute-0 podman[214912]: 2025-11-28 17:48:18.191708041 +0000 UTC m=+0.050140878 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_metadata_agent)
Nov 28 17:48:19 compute-0 nova_compute[187223]: 2025-11-28 17:48:19.207 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:48:19 compute-0 nova_compute[187223]: 2025-11-28 17:48:19.299 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:48:24 compute-0 podman[214932]: 2025-11-28 17:48:24.207697687 +0000 UTC m=+0.065449335 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 28 17:48:24 compute-0 nova_compute[187223]: 2025-11-28 17:48:24.208 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:48:24 compute-0 podman[214933]: 2025-11-28 17:48:24.232673974 +0000 UTC m=+0.080008282 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 28 17:48:24 compute-0 nova_compute[187223]: 2025-11-28 17:48:24.302 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:48:27 compute-0 podman[214978]: 2025-11-28 17:48:27.189735388 +0000 UTC m=+0.049033300 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal, managed_by=edpm_ansible, release=1755695350, vendor=Red Hat, Inc., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, maintainer=Red Hat, Inc.)
Nov 28 17:48:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:48:27.700 104433 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:48:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:48:27.700 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:48:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:48:27.701 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:48:29 compute-0 nova_compute[187223]: 2025-11-28 17:48:29.210 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:48:29 compute-0 nova_compute[187223]: 2025-11-28 17:48:29.304 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:48:29 compute-0 podman[197556]: time="2025-11-28T17:48:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:48:29 compute-0 podman[197556]: @ - - [28/Nov/2025:17:48:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 28 17:48:29 compute-0 podman[197556]: @ - - [28/Nov/2025:17:48:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2595 "" "Go-http-client/1.1"
Nov 28 17:48:31 compute-0 openstack_network_exporter[199717]: ERROR   17:48:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:48:31 compute-0 openstack_network_exporter[199717]: ERROR   17:48:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:48:31 compute-0 openstack_network_exporter[199717]: ERROR   17:48:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:48:31 compute-0 openstack_network_exporter[199717]: ERROR   17:48:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:48:31 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:48:31 compute-0 openstack_network_exporter[199717]: ERROR   17:48:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:48:31 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:48:32 compute-0 nova_compute[187223]: 2025-11-28 17:48:32.351 187227 DEBUG oslo_concurrency.lockutils [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Acquiring lock "f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:48:32 compute-0 nova_compute[187223]: 2025-11-28 17:48:32.351 187227 DEBUG oslo_concurrency.lockutils [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:48:32 compute-0 nova_compute[187223]: 2025-11-28 17:48:32.376 187227 DEBUG nova.compute.manager [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 28 17:48:32 compute-0 nova_compute[187223]: 2025-11-28 17:48:32.465 187227 DEBUG oslo_concurrency.lockutils [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:48:32 compute-0 nova_compute[187223]: 2025-11-28 17:48:32.465 187227 DEBUG oslo_concurrency.lockutils [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:48:32 compute-0 nova_compute[187223]: 2025-11-28 17:48:32.474 187227 DEBUG nova.virt.hardware [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 28 17:48:32 compute-0 nova_compute[187223]: 2025-11-28 17:48:32.475 187227 INFO nova.compute.claims [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Claim successful on node compute-0.ctlplane.example.com
Nov 28 17:48:32 compute-0 nova_compute[187223]: 2025-11-28 17:48:32.842 187227 DEBUG nova.compute.provider_tree [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 17:48:32 compute-0 nova_compute[187223]: 2025-11-28 17:48:32.865 187227 DEBUG nova.scheduler.client.report [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 17:48:32 compute-0 nova_compute[187223]: 2025-11-28 17:48:32.903 187227 DEBUG oslo_concurrency.lockutils [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.437s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:48:32 compute-0 nova_compute[187223]: 2025-11-28 17:48:32.904 187227 DEBUG nova.compute.manager [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 28 17:48:32 compute-0 nova_compute[187223]: 2025-11-28 17:48:32.966 187227 DEBUG nova.compute.manager [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 28 17:48:32 compute-0 nova_compute[187223]: 2025-11-28 17:48:32.966 187227 DEBUG nova.network.neutron [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 28 17:48:32 compute-0 nova_compute[187223]: 2025-11-28 17:48:32.986 187227 INFO nova.virt.libvirt.driver [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 28 17:48:33 compute-0 nova_compute[187223]: 2025-11-28 17:48:33.006 187227 DEBUG nova.compute.manager [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 28 17:48:33 compute-0 nova_compute[187223]: 2025-11-28 17:48:33.124 187227 DEBUG nova.compute.manager [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 28 17:48:33 compute-0 nova_compute[187223]: 2025-11-28 17:48:33.126 187227 DEBUG nova.virt.libvirt.driver [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 28 17:48:33 compute-0 nova_compute[187223]: 2025-11-28 17:48:33.126 187227 INFO nova.virt.libvirt.driver [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Creating image(s)
Nov 28 17:48:33 compute-0 nova_compute[187223]: 2025-11-28 17:48:33.127 187227 DEBUG oslo_concurrency.lockutils [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Acquiring lock "/var/lib/nova/instances/f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:48:33 compute-0 nova_compute[187223]: 2025-11-28 17:48:33.127 187227 DEBUG oslo_concurrency.lockutils [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "/var/lib/nova/instances/f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:48:33 compute-0 nova_compute[187223]: 2025-11-28 17:48:33.128 187227 DEBUG oslo_concurrency.lockutils [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "/var/lib/nova/instances/f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:48:33 compute-0 nova_compute[187223]: 2025-11-28 17:48:33.140 187227 DEBUG oslo_concurrency.processutils [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:48:33 compute-0 nova_compute[187223]: 2025-11-28 17:48:33.236 187227 DEBUG oslo_concurrency.processutils [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:48:33 compute-0 nova_compute[187223]: 2025-11-28 17:48:33.238 187227 DEBUG oslo_concurrency.lockutils [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Acquiring lock "bd6565e0cc32420aac21dd3de8842bc53bb13eba" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:48:33 compute-0 nova_compute[187223]: 2025-11-28 17:48:33.238 187227 DEBUG oslo_concurrency.lockutils [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "bd6565e0cc32420aac21dd3de8842bc53bb13eba" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:48:33 compute-0 nova_compute[187223]: 2025-11-28 17:48:33.251 187227 DEBUG oslo_concurrency.processutils [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:48:33 compute-0 nova_compute[187223]: 2025-11-28 17:48:33.321 187227 DEBUG oslo_concurrency.processutils [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:48:33 compute-0 nova_compute[187223]: 2025-11-28 17:48:33.323 187227 DEBUG oslo_concurrency.processutils [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba,backing_fmt=raw /var/lib/nova/instances/f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:48:33 compute-0 nova_compute[187223]: 2025-11-28 17:48:33.366 187227 DEBUG oslo_concurrency.processutils [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba,backing_fmt=raw /var/lib/nova/instances/f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a/disk 1073741824" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:48:33 compute-0 nova_compute[187223]: 2025-11-28 17:48:33.368 187227 DEBUG oslo_concurrency.lockutils [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "bd6565e0cc32420aac21dd3de8842bc53bb13eba" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:48:33 compute-0 nova_compute[187223]: 2025-11-28 17:48:33.369 187227 DEBUG oslo_concurrency.processutils [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:48:33 compute-0 nova_compute[187223]: 2025-11-28 17:48:33.422 187227 DEBUG oslo_concurrency.processutils [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:48:33 compute-0 nova_compute[187223]: 2025-11-28 17:48:33.424 187227 DEBUG nova.virt.disk.api [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Checking if we can resize image /var/lib/nova/instances/f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 28 17:48:33 compute-0 nova_compute[187223]: 2025-11-28 17:48:33.426 187227 DEBUG oslo_concurrency.processutils [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:48:33 compute-0 nova_compute[187223]: 2025-11-28 17:48:33.510 187227 DEBUG oslo_concurrency.processutils [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a/disk --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:48:33 compute-0 nova_compute[187223]: 2025-11-28 17:48:33.512 187227 DEBUG nova.virt.disk.api [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Cannot resize image /var/lib/nova/instances/f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 28 17:48:33 compute-0 nova_compute[187223]: 2025-11-28 17:48:33.512 187227 DEBUG nova.objects.instance [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lazy-loading 'migration_context' on Instance uuid f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:48:33 compute-0 nova_compute[187223]: 2025-11-28 17:48:33.532 187227 DEBUG nova.virt.libvirt.driver [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 28 17:48:33 compute-0 nova_compute[187223]: 2025-11-28 17:48:33.532 187227 DEBUG nova.virt.libvirt.driver [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Ensure instance console log exists: /var/lib/nova/instances/f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 28 17:48:33 compute-0 nova_compute[187223]: 2025-11-28 17:48:33.533 187227 DEBUG oslo_concurrency.lockutils [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:48:33 compute-0 nova_compute[187223]: 2025-11-28 17:48:33.533 187227 DEBUG oslo_concurrency.lockutils [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:48:33 compute-0 nova_compute[187223]: 2025-11-28 17:48:33.534 187227 DEBUG oslo_concurrency.lockutils [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:48:33 compute-0 nova_compute[187223]: 2025-11-28 17:48:33.640 187227 DEBUG nova.policy [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '40bca16232f3471c8094a414f8874e9a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f987f40adf1f46018ab0ca81b8d954f6', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 28 17:48:34 compute-0 nova_compute[187223]: 2025-11-28 17:48:34.255 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:48:34 compute-0 nova_compute[187223]: 2025-11-28 17:48:34.306 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:48:34 compute-0 nova_compute[187223]: 2025-11-28 17:48:34.678 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:48:34 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:48:34.678 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3e:3c:af', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '0e:e0:60:f9:5f:25'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 17:48:34 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:48:34.681 104433 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 17:48:34 compute-0 nova_compute[187223]: 2025-11-28 17:48:34.778 187227 DEBUG nova.network.neutron [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Successfully created port: 9a913108-e2f3-42e3-92f5-51da2a1f5354 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 28 17:48:35 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:48:35.686 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2ad2dbac-a967-40fb-b69b-7c374c5f8e9d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:48:36 compute-0 nova_compute[187223]: 2025-11-28 17:48:36.675 187227 DEBUG nova.network.neutron [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Successfully updated port: 9a913108-e2f3-42e3-92f5-51da2a1f5354 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 28 17:48:36 compute-0 nova_compute[187223]: 2025-11-28 17:48:36.700 187227 DEBUG oslo_concurrency.lockutils [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Acquiring lock "refresh_cache-f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 17:48:36 compute-0 nova_compute[187223]: 2025-11-28 17:48:36.701 187227 DEBUG oslo_concurrency.lockutils [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Acquired lock "refresh_cache-f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 17:48:36 compute-0 nova_compute[187223]: 2025-11-28 17:48:36.701 187227 DEBUG nova.network.neutron [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 28 17:48:36 compute-0 nova_compute[187223]: 2025-11-28 17:48:36.704 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:48:36 compute-0 nova_compute[187223]: 2025-11-28 17:48:36.816 187227 DEBUG nova.compute.manager [req-438def8f-d855-4b37-bf72-ca2f9756c206 req-fd83937a-625b-43bd-865d-bb561b28c848 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Received event network-changed-9a913108-e2f3-42e3-92f5-51da2a1f5354 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:48:36 compute-0 nova_compute[187223]: 2025-11-28 17:48:36.816 187227 DEBUG nova.compute.manager [req-438def8f-d855-4b37-bf72-ca2f9756c206 req-fd83937a-625b-43bd-865d-bb561b28c848 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Refreshing instance network info cache due to event network-changed-9a913108-e2f3-42e3-92f5-51da2a1f5354. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 28 17:48:36 compute-0 nova_compute[187223]: 2025-11-28 17:48:36.816 187227 DEBUG oslo_concurrency.lockutils [req-438def8f-d855-4b37-bf72-ca2f9756c206 req-fd83937a-625b-43bd-865d-bb561b28c848 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "refresh_cache-f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 17:48:36 compute-0 nova_compute[187223]: 2025-11-28 17:48:36.895 187227 DEBUG nova.network.neutron [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 28 17:48:38 compute-0 nova_compute[187223]: 2025-11-28 17:48:38.092 187227 DEBUG nova.network.neutron [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Updating instance_info_cache with network_info: [{"id": "9a913108-e2f3-42e3-92f5-51da2a1f5354", "address": "fa:16:3e:1e:4f:32", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a913108-e2", "ovs_interfaceid": "9a913108-e2f3-42e3-92f5-51da2a1f5354", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:48:38 compute-0 nova_compute[187223]: 2025-11-28 17:48:38.119 187227 DEBUG oslo_concurrency.lockutils [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Releasing lock "refresh_cache-f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 17:48:38 compute-0 nova_compute[187223]: 2025-11-28 17:48:38.120 187227 DEBUG nova.compute.manager [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Instance network_info: |[{"id": "9a913108-e2f3-42e3-92f5-51da2a1f5354", "address": "fa:16:3e:1e:4f:32", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a913108-e2", "ovs_interfaceid": "9a913108-e2f3-42e3-92f5-51da2a1f5354", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 28 17:48:38 compute-0 nova_compute[187223]: 2025-11-28 17:48:38.120 187227 DEBUG oslo_concurrency.lockutils [req-438def8f-d855-4b37-bf72-ca2f9756c206 req-fd83937a-625b-43bd-865d-bb561b28c848 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquired lock "refresh_cache-f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 17:48:38 compute-0 nova_compute[187223]: 2025-11-28 17:48:38.120 187227 DEBUG nova.network.neutron [req-438def8f-d855-4b37-bf72-ca2f9756c206 req-fd83937a-625b-43bd-865d-bb561b28c848 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Refreshing network info cache for port 9a913108-e2f3-42e3-92f5-51da2a1f5354 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 28 17:48:38 compute-0 nova_compute[187223]: 2025-11-28 17:48:38.124 187227 DEBUG nova.virt.libvirt.driver [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Start _get_guest_xml network_info=[{"id": "9a913108-e2f3-42e3-92f5-51da2a1f5354", "address": "fa:16:3e:1e:4f:32", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a913108-e2", "ovs_interfaceid": "9a913108-e2f3-42e3-92f5-51da2a1f5354", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T17:27:07Z,direct_url=<?>,disk_format='qcow2',id=e66bcfff-a835-4b6a-9892-490d158c356a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ca3b936304c947f98c618f4bf25dee0e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-28T17:27:08Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_format': None, 'device_name': '/dev/vda', 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e66bcfff-a835-4b6a-9892-490d158c356a'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 28 17:48:38 compute-0 nova_compute[187223]: 2025-11-28 17:48:38.129 187227 WARNING nova.virt.libvirt.driver [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 17:48:38 compute-0 nova_compute[187223]: 2025-11-28 17:48:38.141 187227 DEBUG nova.virt.libvirt.host [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 28 17:48:38 compute-0 nova_compute[187223]: 2025-11-28 17:48:38.142 187227 DEBUG nova.virt.libvirt.host [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 28 17:48:38 compute-0 nova_compute[187223]: 2025-11-28 17:48:38.148 187227 DEBUG nova.virt.libvirt.host [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 28 17:48:38 compute-0 nova_compute[187223]: 2025-11-28 17:48:38.149 187227 DEBUG nova.virt.libvirt.host [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 28 17:48:38 compute-0 nova_compute[187223]: 2025-11-28 17:48:38.150 187227 DEBUG nova.virt.libvirt.driver [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 28 17:48:38 compute-0 nova_compute[187223]: 2025-11-28 17:48:38.151 187227 DEBUG nova.virt.hardware [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-28T17:27:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6f44bded-bdbe-4623-9c87-afc5919e8381',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T17:27:07Z,direct_url=<?>,disk_format='qcow2',id=e66bcfff-a835-4b6a-9892-490d158c356a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ca3b936304c947f98c618f4bf25dee0e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-28T17:27:08Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 28 17:48:38 compute-0 nova_compute[187223]: 2025-11-28 17:48:38.151 187227 DEBUG nova.virt.hardware [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 28 17:48:38 compute-0 nova_compute[187223]: 2025-11-28 17:48:38.151 187227 DEBUG nova.virt.hardware [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 28 17:48:38 compute-0 nova_compute[187223]: 2025-11-28 17:48:38.152 187227 DEBUG nova.virt.hardware [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 28 17:48:38 compute-0 nova_compute[187223]: 2025-11-28 17:48:38.152 187227 DEBUG nova.virt.hardware [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 28 17:48:38 compute-0 nova_compute[187223]: 2025-11-28 17:48:38.152 187227 DEBUG nova.virt.hardware [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 28 17:48:38 compute-0 nova_compute[187223]: 2025-11-28 17:48:38.152 187227 DEBUG nova.virt.hardware [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 28 17:48:38 compute-0 nova_compute[187223]: 2025-11-28 17:48:38.152 187227 DEBUG nova.virt.hardware [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 28 17:48:38 compute-0 nova_compute[187223]: 2025-11-28 17:48:38.153 187227 DEBUG nova.virt.hardware [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 28 17:48:38 compute-0 nova_compute[187223]: 2025-11-28 17:48:38.153 187227 DEBUG nova.virt.hardware [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 28 17:48:38 compute-0 nova_compute[187223]: 2025-11-28 17:48:38.153 187227 DEBUG nova.virt.hardware [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 28 17:48:38 compute-0 nova_compute[187223]: 2025-11-28 17:48:38.156 187227 DEBUG nova.virt.libvirt.vif [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T17:48:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-23084288',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-23084288',id=18,image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f987f40adf1f46018ab0ca81b8d954f6',ramdisk_id='',reservation_id='r-rqnvb0oe',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-384316604',owner_user_name='tempest-TestExecuteStrategies-384316604-project-member'},tags=TagList,task_state
='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T17:48:33Z,user_data=None,user_id='40bca16232f3471c8094a414f8874e9a',uuid=f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9a913108-e2f3-42e3-92f5-51da2a1f5354", "address": "fa:16:3e:1e:4f:32", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a913108-e2", "ovs_interfaceid": "9a913108-e2f3-42e3-92f5-51da2a1f5354", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 28 17:48:38 compute-0 nova_compute[187223]: 2025-11-28 17:48:38.157 187227 DEBUG nova.network.os_vif_util [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Converting VIF {"id": "9a913108-e2f3-42e3-92f5-51da2a1f5354", "address": "fa:16:3e:1e:4f:32", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a913108-e2", "ovs_interfaceid": "9a913108-e2f3-42e3-92f5-51da2a1f5354", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 17:48:38 compute-0 nova_compute[187223]: 2025-11-28 17:48:38.157 187227 DEBUG nova.network.os_vif_util [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1e:4f:32,bridge_name='br-int',has_traffic_filtering=True,id=9a913108-e2f3-42e3-92f5-51da2a1f5354,network=Network(7710a7d0-31b3-4473-89c4-40533fdd6e7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9a913108-e2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 17:48:38 compute-0 nova_compute[187223]: 2025-11-28 17:48:38.158 187227 DEBUG nova.objects.instance [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lazy-loading 'pci_devices' on Instance uuid f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:48:38 compute-0 nova_compute[187223]: 2025-11-28 17:48:38.177 187227 DEBUG nova.virt.libvirt.driver [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] End _get_guest_xml xml=<domain type="kvm">
Nov 28 17:48:38 compute-0 nova_compute[187223]:   <uuid>f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a</uuid>
Nov 28 17:48:38 compute-0 nova_compute[187223]:   <name>instance-00000012</name>
Nov 28 17:48:38 compute-0 nova_compute[187223]:   <memory>131072</memory>
Nov 28 17:48:38 compute-0 nova_compute[187223]:   <vcpu>1</vcpu>
Nov 28 17:48:38 compute-0 nova_compute[187223]:   <metadata>
Nov 28 17:48:38 compute-0 nova_compute[187223]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 28 17:48:38 compute-0 nova_compute[187223]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 28 17:48:38 compute-0 nova_compute[187223]:       <nova:name>tempest-TestExecuteStrategies-server-23084288</nova:name>
Nov 28 17:48:38 compute-0 nova_compute[187223]:       <nova:creationTime>2025-11-28 17:48:38</nova:creationTime>
Nov 28 17:48:38 compute-0 nova_compute[187223]:       <nova:flavor name="m1.nano">
Nov 28 17:48:38 compute-0 nova_compute[187223]:         <nova:memory>128</nova:memory>
Nov 28 17:48:38 compute-0 nova_compute[187223]:         <nova:disk>1</nova:disk>
Nov 28 17:48:38 compute-0 nova_compute[187223]:         <nova:swap>0</nova:swap>
Nov 28 17:48:38 compute-0 nova_compute[187223]:         <nova:ephemeral>0</nova:ephemeral>
Nov 28 17:48:38 compute-0 nova_compute[187223]:         <nova:vcpus>1</nova:vcpus>
Nov 28 17:48:38 compute-0 nova_compute[187223]:       </nova:flavor>
Nov 28 17:48:38 compute-0 nova_compute[187223]:       <nova:owner>
Nov 28 17:48:38 compute-0 nova_compute[187223]:         <nova:user uuid="40bca16232f3471c8094a414f8874e9a">tempest-TestExecuteStrategies-384316604-project-member</nova:user>
Nov 28 17:48:38 compute-0 nova_compute[187223]:         <nova:project uuid="f987f40adf1f46018ab0ca81b8d954f6">tempest-TestExecuteStrategies-384316604</nova:project>
Nov 28 17:48:38 compute-0 nova_compute[187223]:       </nova:owner>
Nov 28 17:48:38 compute-0 nova_compute[187223]:       <nova:root type="image" uuid="e66bcfff-a835-4b6a-9892-490d158c356a"/>
Nov 28 17:48:38 compute-0 nova_compute[187223]:       <nova:ports>
Nov 28 17:48:38 compute-0 nova_compute[187223]:         <nova:port uuid="9a913108-e2f3-42e3-92f5-51da2a1f5354">
Nov 28 17:48:38 compute-0 nova_compute[187223]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 28 17:48:38 compute-0 nova_compute[187223]:         </nova:port>
Nov 28 17:48:38 compute-0 nova_compute[187223]:       </nova:ports>
Nov 28 17:48:38 compute-0 nova_compute[187223]:     </nova:instance>
Nov 28 17:48:38 compute-0 nova_compute[187223]:   </metadata>
Nov 28 17:48:38 compute-0 nova_compute[187223]:   <sysinfo type="smbios">
Nov 28 17:48:38 compute-0 nova_compute[187223]:     <system>
Nov 28 17:48:38 compute-0 nova_compute[187223]:       <entry name="manufacturer">RDO</entry>
Nov 28 17:48:38 compute-0 nova_compute[187223]:       <entry name="product">OpenStack Compute</entry>
Nov 28 17:48:38 compute-0 nova_compute[187223]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 28 17:48:38 compute-0 nova_compute[187223]:       <entry name="serial">f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a</entry>
Nov 28 17:48:38 compute-0 nova_compute[187223]:       <entry name="uuid">f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a</entry>
Nov 28 17:48:38 compute-0 nova_compute[187223]:       <entry name="family">Virtual Machine</entry>
Nov 28 17:48:38 compute-0 nova_compute[187223]:     </system>
Nov 28 17:48:38 compute-0 nova_compute[187223]:   </sysinfo>
Nov 28 17:48:38 compute-0 nova_compute[187223]:   <os>
Nov 28 17:48:38 compute-0 nova_compute[187223]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 28 17:48:38 compute-0 nova_compute[187223]:     <boot dev="hd"/>
Nov 28 17:48:38 compute-0 nova_compute[187223]:     <smbios mode="sysinfo"/>
Nov 28 17:48:38 compute-0 nova_compute[187223]:   </os>
Nov 28 17:48:38 compute-0 nova_compute[187223]:   <features>
Nov 28 17:48:38 compute-0 nova_compute[187223]:     <acpi/>
Nov 28 17:48:38 compute-0 nova_compute[187223]:     <apic/>
Nov 28 17:48:38 compute-0 nova_compute[187223]:     <vmcoreinfo/>
Nov 28 17:48:38 compute-0 nova_compute[187223]:   </features>
Nov 28 17:48:38 compute-0 nova_compute[187223]:   <clock offset="utc">
Nov 28 17:48:38 compute-0 nova_compute[187223]:     <timer name="pit" tickpolicy="delay"/>
Nov 28 17:48:38 compute-0 nova_compute[187223]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 28 17:48:38 compute-0 nova_compute[187223]:     <timer name="hpet" present="no"/>
Nov 28 17:48:38 compute-0 nova_compute[187223]:   </clock>
Nov 28 17:48:38 compute-0 nova_compute[187223]:   <cpu mode="custom" match="exact">
Nov 28 17:48:38 compute-0 nova_compute[187223]:     <model>Nehalem</model>
Nov 28 17:48:38 compute-0 nova_compute[187223]:     <topology sockets="1" cores="1" threads="1"/>
Nov 28 17:48:38 compute-0 nova_compute[187223]:   </cpu>
Nov 28 17:48:38 compute-0 nova_compute[187223]:   <devices>
Nov 28 17:48:38 compute-0 nova_compute[187223]:     <disk type="file" device="disk">
Nov 28 17:48:38 compute-0 nova_compute[187223]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 28 17:48:38 compute-0 nova_compute[187223]:       <source file="/var/lib/nova/instances/f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a/disk"/>
Nov 28 17:48:38 compute-0 nova_compute[187223]:       <target dev="vda" bus="virtio"/>
Nov 28 17:48:38 compute-0 nova_compute[187223]:     </disk>
Nov 28 17:48:38 compute-0 nova_compute[187223]:     <disk type="file" device="cdrom">
Nov 28 17:48:38 compute-0 nova_compute[187223]:       <driver name="qemu" type="raw" cache="none"/>
Nov 28 17:48:38 compute-0 nova_compute[187223]:       <source file="/var/lib/nova/instances/f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a/disk.config"/>
Nov 28 17:48:38 compute-0 nova_compute[187223]:       <target dev="sda" bus="sata"/>
Nov 28 17:48:38 compute-0 nova_compute[187223]:     </disk>
Nov 28 17:48:38 compute-0 nova_compute[187223]:     <interface type="ethernet">
Nov 28 17:48:38 compute-0 nova_compute[187223]:       <mac address="fa:16:3e:1e:4f:32"/>
Nov 28 17:48:38 compute-0 nova_compute[187223]:       <model type="virtio"/>
Nov 28 17:48:38 compute-0 nova_compute[187223]:       <driver name="vhost" rx_queue_size="512"/>
Nov 28 17:48:38 compute-0 nova_compute[187223]:       <mtu size="1442"/>
Nov 28 17:48:38 compute-0 nova_compute[187223]:       <target dev="tap9a913108-e2"/>
Nov 28 17:48:38 compute-0 nova_compute[187223]:     </interface>
Nov 28 17:48:38 compute-0 nova_compute[187223]:     <serial type="pty">
Nov 28 17:48:38 compute-0 nova_compute[187223]:       <log file="/var/lib/nova/instances/f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a/console.log" append="off"/>
Nov 28 17:48:38 compute-0 nova_compute[187223]:     </serial>
Nov 28 17:48:38 compute-0 nova_compute[187223]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 28 17:48:38 compute-0 nova_compute[187223]:     <video>
Nov 28 17:48:38 compute-0 nova_compute[187223]:       <model type="virtio"/>
Nov 28 17:48:38 compute-0 nova_compute[187223]:     </video>
Nov 28 17:48:38 compute-0 nova_compute[187223]:     <input type="tablet" bus="usb"/>
Nov 28 17:48:38 compute-0 nova_compute[187223]:     <rng model="virtio">
Nov 28 17:48:38 compute-0 nova_compute[187223]:       <backend model="random">/dev/urandom</backend>
Nov 28 17:48:38 compute-0 nova_compute[187223]:     </rng>
Nov 28 17:48:38 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root"/>
Nov 28 17:48:38 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:48:38 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:48:38 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:48:38 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:48:38 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:48:38 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:48:38 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:48:38 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:48:38 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:48:38 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:48:38 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:48:38 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:48:38 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:48:38 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:48:38 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:48:38 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:48:38 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:48:38 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:48:38 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:48:38 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:48:38 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:48:38 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:48:38 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:48:38 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:48:38 compute-0 nova_compute[187223]:     <controller type="usb" index="0"/>
Nov 28 17:48:38 compute-0 nova_compute[187223]:     <memballoon model="virtio">
Nov 28 17:48:38 compute-0 nova_compute[187223]:       <stats period="10"/>
Nov 28 17:48:38 compute-0 nova_compute[187223]:     </memballoon>
Nov 28 17:48:38 compute-0 nova_compute[187223]:   </devices>
Nov 28 17:48:38 compute-0 nova_compute[187223]: </domain>
Nov 28 17:48:38 compute-0 nova_compute[187223]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 28 17:48:38 compute-0 nova_compute[187223]: 2025-11-28 17:48:38.179 187227 DEBUG nova.compute.manager [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Preparing to wait for external event network-vif-plugged-9a913108-e2f3-42e3-92f5-51da2a1f5354 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 28 17:48:38 compute-0 nova_compute[187223]: 2025-11-28 17:48:38.179 187227 DEBUG oslo_concurrency.lockutils [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Acquiring lock "f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:48:38 compute-0 nova_compute[187223]: 2025-11-28 17:48:38.179 187227 DEBUG oslo_concurrency.lockutils [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:48:38 compute-0 nova_compute[187223]: 2025-11-28 17:48:38.180 187227 DEBUG oslo_concurrency.lockutils [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:48:38 compute-0 nova_compute[187223]: 2025-11-28 17:48:38.180 187227 DEBUG nova.virt.libvirt.vif [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T17:48:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-23084288',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-23084288',id=18,image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f987f40adf1f46018ab0ca81b8d954f6',ramdisk_id='',reservation_id='r-rqnvb0oe',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-384316604',owner_user_name='tempest-TestExecuteStrategies-384316604-project-member'},tags=TagList,
task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T17:48:33Z,user_data=None,user_id='40bca16232f3471c8094a414f8874e9a',uuid=f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9a913108-e2f3-42e3-92f5-51da2a1f5354", "address": "fa:16:3e:1e:4f:32", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a913108-e2", "ovs_interfaceid": "9a913108-e2f3-42e3-92f5-51da2a1f5354", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 28 17:48:38 compute-0 nova_compute[187223]: 2025-11-28 17:48:38.181 187227 DEBUG nova.network.os_vif_util [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Converting VIF {"id": "9a913108-e2f3-42e3-92f5-51da2a1f5354", "address": "fa:16:3e:1e:4f:32", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a913108-e2", "ovs_interfaceid": "9a913108-e2f3-42e3-92f5-51da2a1f5354", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 17:48:38 compute-0 nova_compute[187223]: 2025-11-28 17:48:38.181 187227 DEBUG nova.network.os_vif_util [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1e:4f:32,bridge_name='br-int',has_traffic_filtering=True,id=9a913108-e2f3-42e3-92f5-51da2a1f5354,network=Network(7710a7d0-31b3-4473-89c4-40533fdd6e7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9a913108-e2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 17:48:38 compute-0 nova_compute[187223]: 2025-11-28 17:48:38.182 187227 DEBUG os_vif [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:4f:32,bridge_name='br-int',has_traffic_filtering=True,id=9a913108-e2f3-42e3-92f5-51da2a1f5354,network=Network(7710a7d0-31b3-4473-89c4-40533fdd6e7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9a913108-e2') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 28 17:48:38 compute-0 nova_compute[187223]: 2025-11-28 17:48:38.182 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:48:38 compute-0 nova_compute[187223]: 2025-11-28 17:48:38.182 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:48:38 compute-0 nova_compute[187223]: 2025-11-28 17:48:38.183 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 17:48:38 compute-0 nova_compute[187223]: 2025-11-28 17:48:38.188 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:48:38 compute-0 nova_compute[187223]: 2025-11-28 17:48:38.188 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9a913108-e2, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:48:38 compute-0 nova_compute[187223]: 2025-11-28 17:48:38.189 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9a913108-e2, col_values=(('external_ids', {'iface-id': '9a913108-e2f3-42e3-92f5-51da2a1f5354', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1e:4f:32', 'vm-uuid': 'f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:48:38 compute-0 nova_compute[187223]: 2025-11-28 17:48:38.192 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:48:38 compute-0 nova_compute[187223]: 2025-11-28 17:48:38.195 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 17:48:38 compute-0 NetworkManager[55763]: <info>  [1764352118.1960] manager: (tap9a913108-e2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/59)
Nov 28 17:48:38 compute-0 nova_compute[187223]: 2025-11-28 17:48:38.202 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:48:38 compute-0 nova_compute[187223]: 2025-11-28 17:48:38.203 187227 INFO os_vif [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:4f:32,bridge_name='br-int',has_traffic_filtering=True,id=9a913108-e2f3-42e3-92f5-51da2a1f5354,network=Network(7710a7d0-31b3-4473-89c4-40533fdd6e7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9a913108-e2')
Nov 28 17:48:38 compute-0 nova_compute[187223]: 2025-11-28 17:48:38.261 187227 DEBUG nova.virt.libvirt.driver [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 28 17:48:38 compute-0 nova_compute[187223]: 2025-11-28 17:48:38.261 187227 DEBUG nova.virt.libvirt.driver [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 28 17:48:38 compute-0 nova_compute[187223]: 2025-11-28 17:48:38.261 187227 DEBUG nova.virt.libvirt.driver [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] No VIF found with MAC fa:16:3e:1e:4f:32, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 28 17:48:38 compute-0 nova_compute[187223]: 2025-11-28 17:48:38.262 187227 INFO nova.virt.libvirt.driver [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Using config drive
Nov 28 17:48:38 compute-0 nova_compute[187223]: 2025-11-28 17:48:38.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:48:38 compute-0 nova_compute[187223]: 2025-11-28 17:48:38.684 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 17:48:38 compute-0 nova_compute[187223]: 2025-11-28 17:48:38.876 187227 INFO nova.virt.libvirt.driver [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Creating config drive at /var/lib/nova/instances/f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a/disk.config
Nov 28 17:48:38 compute-0 nova_compute[187223]: 2025-11-28 17:48:38.887 187227 DEBUG oslo_concurrency.processutils [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4mjbmkpx execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:48:39 compute-0 nova_compute[187223]: 2025-11-28 17:48:39.017 187227 DEBUG oslo_concurrency.processutils [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4mjbmkpx" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:48:39 compute-0 kernel: tap9a913108-e2: entered promiscuous mode
Nov 28 17:48:39 compute-0 NetworkManager[55763]: <info>  [1764352119.1107] manager: (tap9a913108-e2): new Tun device (/org/freedesktop/NetworkManager/Devices/60)
Nov 28 17:48:39 compute-0 nova_compute[187223]: 2025-11-28 17:48:39.149 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:48:39 compute-0 ovn_controller[95574]: 2025-11-28T17:48:39Z|00141|binding|INFO|Claiming lport 9a913108-e2f3-42e3-92f5-51da2a1f5354 for this chassis.
Nov 28 17:48:39 compute-0 ovn_controller[95574]: 2025-11-28T17:48:39Z|00142|binding|INFO|9a913108-e2f3-42e3-92f5-51da2a1f5354: Claiming fa:16:3e:1e:4f:32 10.100.0.12
Nov 28 17:48:39 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:48:39.158 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1e:4f:32 10.100.0.12'], port_security=['fa:16:3e:1e:4f:32 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f987f40adf1f46018ab0ca81b8d954f6', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2454196c-3f33-40c4-b65e-ff5ce51cee25', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8b91a17c-2725-4cda-baa8-a6987e8e4bba, chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], logical_port=9a913108-e2f3-42e3-92f5-51da2a1f5354) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 17:48:39 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:48:39.161 104433 INFO neutron.agent.ovn.metadata.agent [-] Port 9a913108-e2f3-42e3-92f5-51da2a1f5354 in datapath 7710a7d0-31b3-4473-89c4-40533fdd6e7d bound to our chassis
Nov 28 17:48:39 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:48:39.162 104433 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7710a7d0-31b3-4473-89c4-40533fdd6e7d
Nov 28 17:48:39 compute-0 ovn_controller[95574]: 2025-11-28T17:48:39Z|00143|binding|INFO|Setting lport 9a913108-e2f3-42e3-92f5-51da2a1f5354 up in Southbound
Nov 28 17:48:39 compute-0 ovn_controller[95574]: 2025-11-28T17:48:39Z|00144|binding|INFO|Setting lport 9a913108-e2f3-42e3-92f5-51da2a1f5354 ovn-installed in OVS
Nov 28 17:48:39 compute-0 nova_compute[187223]: 2025-11-28 17:48:39.165 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:48:39 compute-0 nova_compute[187223]: 2025-11-28 17:48:39.169 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:48:39 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:48:39.174 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[2707beb3-f22f-4d35-8411-24fd4d45540e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:48:39 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:48:39.176 104433 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7710a7d0-31 in ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 28 17:48:39 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:48:39.177 208826 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7710a7d0-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 28 17:48:39 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:48:39.178 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[00ca9626-8d0f-41fe-b41b-2af956a21d8a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:48:39 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:48:39.179 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[bf1614a7-556e-4f70-a4a9-05b1413afbc9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:48:39 compute-0 systemd-machined[153517]: New machine qemu-13-instance-00000012.
Nov 28 17:48:39 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:48:39.192 104546 DEBUG oslo.privsep.daemon [-] privsep: reply[df82b20b-b6a4-43a7-ab4b-56d05e48efa7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:48:39 compute-0 systemd[1]: Started Virtual Machine qemu-13-instance-00000012.
Nov 28 17:48:39 compute-0 systemd-udevd[215037]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 17:48:39 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:48:39.208 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[2a25dc4e-b99a-4152-96ee-faef1852519e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:48:39 compute-0 NetworkManager[55763]: <info>  [1764352119.2182] device (tap9a913108-e2): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 28 17:48:39 compute-0 NetworkManager[55763]: <info>  [1764352119.2194] device (tap9a913108-e2): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 28 17:48:39 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:48:39.233 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[f46c173d-94b4-471b-8264-960ccae20882]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:48:39 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:48:39.240 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[1b15b0c8-74f6-46a4-9a40-f0838b4a3a0f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:48:39 compute-0 NetworkManager[55763]: <info>  [1764352119.2417] manager: (tap7710a7d0-30): new Veth device (/org/freedesktop/NetworkManager/Devices/61)
Nov 28 17:48:39 compute-0 nova_compute[187223]: 2025-11-28 17:48:39.256 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:48:39 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:48:39.280 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[0cc514d0-02b7-4ab3-87d0-e05d8e771427]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:48:39 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:48:39.284 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[505be151-d69e-4f37-bc3e-3796840b439f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:48:39 compute-0 NetworkManager[55763]: <info>  [1764352119.3056] device (tap7710a7d0-30): carrier: link connected
Nov 28 17:48:39 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:48:39.310 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[c80f9696-e8f3-418d-bfac-f2cfc29ededc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:48:39 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:48:39.325 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[02e998e8-4777-4fa4-92ec-2b39019e3a9c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7710a7d0-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7e:b9:9f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 42], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 529393, 'reachable_time': 20935, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215067, 'error': None, 'target': 'ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:48:39 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:48:39.339 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[dc6b5811-b85e-4935-9409-8f2280e44e3f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe7e:b99f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 529393, 'tstamp': 529393}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215068, 'error': None, 'target': 'ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:48:39 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:48:39.356 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[5113f50b-a80c-47ef-be92-761ef3e7eda3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7710a7d0-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7e:b9:9f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 42], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 529393, 'reachable_time': 20935, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215069, 'error': None, 'target': 'ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:48:39 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:48:39.385 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[55a102e2-b059-4727-9f32-1e84a7a14c1a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:48:39 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:48:39.443 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[cf01d24b-71c3-481a-bfe7-ae103552041f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:48:39 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:48:39.446 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7710a7d0-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:48:39 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:48:39.446 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 17:48:39 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:48:39.447 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7710a7d0-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:48:39 compute-0 kernel: tap7710a7d0-30: entered promiscuous mode
Nov 28 17:48:39 compute-0 NetworkManager[55763]: <info>  [1764352119.4525] manager: (tap7710a7d0-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/62)
Nov 28 17:48:39 compute-0 nova_compute[187223]: 2025-11-28 17:48:39.453 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:48:39 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:48:39.454 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7710a7d0-30, col_values=(('external_ids', {'iface-id': 'bc789832-2d4b-4b14-95c2-e30a740a3a6b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:48:39 compute-0 ovn_controller[95574]: 2025-11-28T17:48:39Z|00145|binding|INFO|Releasing lport bc789832-2d4b-4b14-95c2-e30a740a3a6b from this chassis (sb_readonly=0)
Nov 28 17:48:39 compute-0 nova_compute[187223]: 2025-11-28 17:48:39.456 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:48:39 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:48:39.457 104433 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7710a7d0-31b3-4473-89c4-40533fdd6e7d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7710a7d0-31b3-4473-89c4-40533fdd6e7d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 28 17:48:39 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:48:39.458 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[288d961c-0fa4-48ab-911f-e61f08ea86a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:48:39 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:48:39.459 104433 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 28 17:48:39 compute-0 ovn_metadata_agent[104428]: global
Nov 28 17:48:39 compute-0 ovn_metadata_agent[104428]:     log         /dev/log local0 debug
Nov 28 17:48:39 compute-0 ovn_metadata_agent[104428]:     log-tag     haproxy-metadata-proxy-7710a7d0-31b3-4473-89c4-40533fdd6e7d
Nov 28 17:48:39 compute-0 ovn_metadata_agent[104428]:     user        root
Nov 28 17:48:39 compute-0 ovn_metadata_agent[104428]:     group       root
Nov 28 17:48:39 compute-0 ovn_metadata_agent[104428]:     maxconn     1024
Nov 28 17:48:39 compute-0 ovn_metadata_agent[104428]:     pidfile     /var/lib/neutron/external/pids/7710a7d0-31b3-4473-89c4-40533fdd6e7d.pid.haproxy
Nov 28 17:48:39 compute-0 ovn_metadata_agent[104428]:     daemon
Nov 28 17:48:39 compute-0 ovn_metadata_agent[104428]: 
Nov 28 17:48:39 compute-0 ovn_metadata_agent[104428]: defaults
Nov 28 17:48:39 compute-0 ovn_metadata_agent[104428]:     log global
Nov 28 17:48:39 compute-0 ovn_metadata_agent[104428]:     mode http
Nov 28 17:48:39 compute-0 ovn_metadata_agent[104428]:     option httplog
Nov 28 17:48:39 compute-0 ovn_metadata_agent[104428]:     option dontlognull
Nov 28 17:48:39 compute-0 ovn_metadata_agent[104428]:     option http-server-close
Nov 28 17:48:39 compute-0 ovn_metadata_agent[104428]:     option forwardfor
Nov 28 17:48:39 compute-0 ovn_metadata_agent[104428]:     retries                 3
Nov 28 17:48:39 compute-0 ovn_metadata_agent[104428]:     timeout http-request    30s
Nov 28 17:48:39 compute-0 ovn_metadata_agent[104428]:     timeout connect         30s
Nov 28 17:48:39 compute-0 ovn_metadata_agent[104428]:     timeout client          32s
Nov 28 17:48:39 compute-0 ovn_metadata_agent[104428]:     timeout server          32s
Nov 28 17:48:39 compute-0 ovn_metadata_agent[104428]:     timeout http-keep-alive 30s
Nov 28 17:48:39 compute-0 ovn_metadata_agent[104428]: 
Nov 28 17:48:39 compute-0 ovn_metadata_agent[104428]: 
Nov 28 17:48:39 compute-0 ovn_metadata_agent[104428]: listen listener
Nov 28 17:48:39 compute-0 ovn_metadata_agent[104428]:     bind 169.254.169.254:80
Nov 28 17:48:39 compute-0 ovn_metadata_agent[104428]:     server metadata /var/lib/neutron/metadata_proxy
Nov 28 17:48:39 compute-0 ovn_metadata_agent[104428]:     http-request add-header X-OVN-Network-ID 7710a7d0-31b3-4473-89c4-40533fdd6e7d
Nov 28 17:48:39 compute-0 ovn_metadata_agent[104428]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 28 17:48:39 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:48:39.460 104433 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'env', 'PROCESS_TAG=haproxy-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7710a7d0-31b3-4473-89c4-40533fdd6e7d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 28 17:48:39 compute-0 nova_compute[187223]: 2025-11-28 17:48:39.468 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:48:39 compute-0 nova_compute[187223]: 2025-11-28 17:48:39.574 187227 DEBUG nova.virt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Emitting event <LifecycleEvent: 1764352119.5732956, f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:48:39 compute-0 nova_compute[187223]: 2025-11-28 17:48:39.574 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] VM Started (Lifecycle Event)
Nov 28 17:48:39 compute-0 nova_compute[187223]: 2025-11-28 17:48:39.602 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:48:39 compute-0 nova_compute[187223]: 2025-11-28 17:48:39.606 187227 DEBUG nova.virt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Emitting event <LifecycleEvent: 1764352119.5734534, f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:48:39 compute-0 nova_compute[187223]: 2025-11-28 17:48:39.606 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] VM Paused (Lifecycle Event)
Nov 28 17:48:39 compute-0 nova_compute[187223]: 2025-11-28 17:48:39.623 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:48:39 compute-0 nova_compute[187223]: 2025-11-28 17:48:39.626 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 28 17:48:39 compute-0 nova_compute[187223]: 2025-11-28 17:48:39.652 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 28 17:48:39 compute-0 nova_compute[187223]: 2025-11-28 17:48:39.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:48:39 compute-0 nova_compute[187223]: 2025-11-28 17:48:39.752 187227 DEBUG nova.compute.manager [req-4afb733d-2b8b-4fa0-8c72-ff18d3539eca req-99338efd-017d-4ec3-9020-966e3849cfd0 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Received event network-vif-plugged-9a913108-e2f3-42e3-92f5-51da2a1f5354 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:48:39 compute-0 nova_compute[187223]: 2025-11-28 17:48:39.752 187227 DEBUG oslo_concurrency.lockutils [req-4afb733d-2b8b-4fa0-8c72-ff18d3539eca req-99338efd-017d-4ec3-9020-966e3849cfd0 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:48:39 compute-0 nova_compute[187223]: 2025-11-28 17:48:39.753 187227 DEBUG oslo_concurrency.lockutils [req-4afb733d-2b8b-4fa0-8c72-ff18d3539eca req-99338efd-017d-4ec3-9020-966e3849cfd0 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:48:39 compute-0 nova_compute[187223]: 2025-11-28 17:48:39.753 187227 DEBUG oslo_concurrency.lockutils [req-4afb733d-2b8b-4fa0-8c72-ff18d3539eca req-99338efd-017d-4ec3-9020-966e3849cfd0 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:48:39 compute-0 nova_compute[187223]: 2025-11-28 17:48:39.753 187227 DEBUG nova.compute.manager [req-4afb733d-2b8b-4fa0-8c72-ff18d3539eca req-99338efd-017d-4ec3-9020-966e3849cfd0 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Processing event network-vif-plugged-9a913108-e2f3-42e3-92f5-51da2a1f5354 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 28 17:48:39 compute-0 nova_compute[187223]: 2025-11-28 17:48:39.753 187227 DEBUG nova.compute.manager [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 28 17:48:39 compute-0 nova_compute[187223]: 2025-11-28 17:48:39.758 187227 DEBUG nova.virt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Emitting event <LifecycleEvent: 1764352119.7565691, f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:48:39 compute-0 nova_compute[187223]: 2025-11-28 17:48:39.758 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] VM Resumed (Lifecycle Event)
Nov 28 17:48:39 compute-0 nova_compute[187223]: 2025-11-28 17:48:39.759 187227 DEBUG nova.virt.libvirt.driver [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 28 17:48:39 compute-0 nova_compute[187223]: 2025-11-28 17:48:39.763 187227 INFO nova.virt.libvirt.driver [-] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Instance spawned successfully.
Nov 28 17:48:39 compute-0 nova_compute[187223]: 2025-11-28 17:48:39.763 187227 DEBUG nova.virt.libvirt.driver [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 28 17:48:39 compute-0 nova_compute[187223]: 2025-11-28 17:48:39.792 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:48:39 compute-0 nova_compute[187223]: 2025-11-28 17:48:39.794 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 28 17:48:39 compute-0 nova_compute[187223]: 2025-11-28 17:48:39.813 187227 DEBUG nova.virt.libvirt.driver [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:48:39 compute-0 nova_compute[187223]: 2025-11-28 17:48:39.813 187227 DEBUG nova.virt.libvirt.driver [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:48:39 compute-0 nova_compute[187223]: 2025-11-28 17:48:39.814 187227 DEBUG nova.virt.libvirt.driver [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:48:39 compute-0 nova_compute[187223]: 2025-11-28 17:48:39.814 187227 DEBUG nova.virt.libvirt.driver [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:48:39 compute-0 nova_compute[187223]: 2025-11-28 17:48:39.814 187227 DEBUG nova.virt.libvirt.driver [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:48:39 compute-0 nova_compute[187223]: 2025-11-28 17:48:39.815 187227 DEBUG nova.virt.libvirt.driver [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:48:39 compute-0 nova_compute[187223]: 2025-11-28 17:48:39.826 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 28 17:48:39 compute-0 podman[215108]: 2025-11-28 17:48:39.870621737 +0000 UTC m=+0.058480085 container create db3e1a7d141fbc8b7af8d50d75b661aa684bc5c5d5cd31c68501ab623a00a28f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 28 17:48:39 compute-0 nova_compute[187223]: 2025-11-28 17:48:39.903 187227 INFO nova.compute.manager [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Took 6.78 seconds to spawn the instance on the hypervisor.
Nov 28 17:48:39 compute-0 nova_compute[187223]: 2025-11-28 17:48:39.904 187227 DEBUG nova.compute.manager [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:48:39 compute-0 systemd[1]: Started libpod-conmon-db3e1a7d141fbc8b7af8d50d75b661aa684bc5c5d5cd31c68501ab623a00a28f.scope.
Nov 28 17:48:39 compute-0 systemd[1]: Started libcrun container.
Nov 28 17:48:39 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b9498ee30e2e0545a1b335bd5ddcfc909449b5e41595a7df5fa2603f0dcbbc71/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 17:48:39 compute-0 podman[215108]: 2025-11-28 17:48:39.837762716 +0000 UTC m=+0.025621094 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 17:48:39 compute-0 podman[215108]: 2025-11-28 17:48:39.944371966 +0000 UTC m=+0.132230334 container init db3e1a7d141fbc8b7af8d50d75b661aa684bc5c5d5cd31c68501ab623a00a28f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 28 17:48:39 compute-0 podman[215108]: 2025-11-28 17:48:39.95108562 +0000 UTC m=+0.138943968 container start db3e1a7d141fbc8b7af8d50d75b661aa684bc5c5d5cd31c68501ab623a00a28f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 28 17:48:39 compute-0 nova_compute[187223]: 2025-11-28 17:48:39.964 187227 INFO nova.compute.manager [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Took 7.54 seconds to build instance.
Nov 28 17:48:39 compute-0 neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d[215124]: [NOTICE]   (215128) : New worker (215130) forked
Nov 28 17:48:39 compute-0 neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d[215124]: [NOTICE]   (215128) : Loading success.
Nov 28 17:48:40 compute-0 nova_compute[187223]: 2025-11-28 17:48:40.010 187227 DEBUG oslo_concurrency.lockutils [None req-dfc3f167-4bfa-49bf-a04d-f257a102a44b 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.659s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:48:40 compute-0 nova_compute[187223]: 2025-11-28 17:48:40.092 187227 DEBUG nova.network.neutron [req-438def8f-d855-4b37-bf72-ca2f9756c206 req-fd83937a-625b-43bd-865d-bb561b28c848 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Updated VIF entry in instance network info cache for port 9a913108-e2f3-42e3-92f5-51da2a1f5354. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 28 17:48:40 compute-0 nova_compute[187223]: 2025-11-28 17:48:40.092 187227 DEBUG nova.network.neutron [req-438def8f-d855-4b37-bf72-ca2f9756c206 req-fd83937a-625b-43bd-865d-bb561b28c848 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Updating instance_info_cache with network_info: [{"id": "9a913108-e2f3-42e3-92f5-51da2a1f5354", "address": "fa:16:3e:1e:4f:32", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a913108-e2", "ovs_interfaceid": "9a913108-e2f3-42e3-92f5-51da2a1f5354", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:48:40 compute-0 nova_compute[187223]: 2025-11-28 17:48:40.121 187227 DEBUG oslo_concurrency.lockutils [req-438def8f-d855-4b37-bf72-ca2f9756c206 req-fd83937a-625b-43bd-865d-bb561b28c848 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Releasing lock "refresh_cache-f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 17:48:40 compute-0 nova_compute[187223]: 2025-11-28 17:48:40.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:48:40 compute-0 nova_compute[187223]: 2025-11-28 17:48:40.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:48:40 compute-0 nova_compute[187223]: 2025-11-28 17:48:40.709 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:48:40 compute-0 nova_compute[187223]: 2025-11-28 17:48:40.710 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:48:40 compute-0 nova_compute[187223]: 2025-11-28 17:48:40.710 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:48:40 compute-0 nova_compute[187223]: 2025-11-28 17:48:40.710 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 17:48:40 compute-0 nova_compute[187223]: 2025-11-28 17:48:40.803 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:48:40 compute-0 podman[215140]: 2025-11-28 17:48:40.810556048 +0000 UTC m=+0.057395137 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 17:48:40 compute-0 nova_compute[187223]: 2025-11-28 17:48:40.860 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:48:40 compute-0 nova_compute[187223]: 2025-11-28 17:48:40.861 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:48:40 compute-0 nova_compute[187223]: 2025-11-28 17:48:40.914 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:48:41 compute-0 nova_compute[187223]: 2025-11-28 17:48:41.107 187227 WARNING nova.virt.libvirt.driver [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 17:48:41 compute-0 nova_compute[187223]: 2025-11-28 17:48:41.108 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5759MB free_disk=73.33958435058594GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 17:48:41 compute-0 nova_compute[187223]: 2025-11-28 17:48:41.108 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:48:41 compute-0 nova_compute[187223]: 2025-11-28 17:48:41.109 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:48:41 compute-0 nova_compute[187223]: 2025-11-28 17:48:41.203 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Instance f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 17:48:41 compute-0 nova_compute[187223]: 2025-11-28 17:48:41.204 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 17:48:41 compute-0 nova_compute[187223]: 2025-11-28 17:48:41.204 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 17:48:41 compute-0 nova_compute[187223]: 2025-11-28 17:48:41.244 187227 DEBUG nova.compute.provider_tree [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 17:48:41 compute-0 nova_compute[187223]: 2025-11-28 17:48:41.257 187227 DEBUG nova.scheduler.client.report [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 17:48:41 compute-0 nova_compute[187223]: 2025-11-28 17:48:41.287 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 17:48:41 compute-0 nova_compute[187223]: 2025-11-28 17:48:41.287 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.178s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:48:41 compute-0 nova_compute[187223]: 2025-11-28 17:48:41.898 187227 DEBUG nova.compute.manager [req-6192d6a6-7b42-4b9e-93f7-cd68a0e3b815 req-763280e5-f933-4712-9816-6e362feefed5 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Received event network-vif-plugged-9a913108-e2f3-42e3-92f5-51da2a1f5354 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:48:41 compute-0 nova_compute[187223]: 2025-11-28 17:48:41.899 187227 DEBUG oslo_concurrency.lockutils [req-6192d6a6-7b42-4b9e-93f7-cd68a0e3b815 req-763280e5-f933-4712-9816-6e362feefed5 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:48:41 compute-0 nova_compute[187223]: 2025-11-28 17:48:41.900 187227 DEBUG oslo_concurrency.lockutils [req-6192d6a6-7b42-4b9e-93f7-cd68a0e3b815 req-763280e5-f933-4712-9816-6e362feefed5 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:48:41 compute-0 nova_compute[187223]: 2025-11-28 17:48:41.900 187227 DEBUG oslo_concurrency.lockutils [req-6192d6a6-7b42-4b9e-93f7-cd68a0e3b815 req-763280e5-f933-4712-9816-6e362feefed5 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:48:41 compute-0 nova_compute[187223]: 2025-11-28 17:48:41.901 187227 DEBUG nova.compute.manager [req-6192d6a6-7b42-4b9e-93f7-cd68a0e3b815 req-763280e5-f933-4712-9816-6e362feefed5 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] No waiting events found dispatching network-vif-plugged-9a913108-e2f3-42e3-92f5-51da2a1f5354 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:48:41 compute-0 nova_compute[187223]: 2025-11-28 17:48:41.901 187227 WARNING nova.compute.manager [req-6192d6a6-7b42-4b9e-93f7-cd68a0e3b815 req-763280e5-f933-4712-9816-6e362feefed5 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Received unexpected event network-vif-plugged-9a913108-e2f3-42e3-92f5-51da2a1f5354 for instance with vm_state active and task_state None.
Nov 28 17:48:42 compute-0 nova_compute[187223]: 2025-11-28 17:48:42.283 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:48:43 compute-0 nova_compute[187223]: 2025-11-28 17:48:43.193 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:48:43 compute-0 nova_compute[187223]: 2025-11-28 17:48:43.685 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:48:43 compute-0 nova_compute[187223]: 2025-11-28 17:48:43.686 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 17:48:43 compute-0 nova_compute[187223]: 2025-11-28 17:48:43.686 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 17:48:43 compute-0 nova_compute[187223]: 2025-11-28 17:48:43.891 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "refresh_cache-f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 17:48:43 compute-0 nova_compute[187223]: 2025-11-28 17:48:43.892 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquired lock "refresh_cache-f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 17:48:43 compute-0 nova_compute[187223]: 2025-11-28 17:48:43.892 187227 DEBUG nova.network.neutron [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 28 17:48:43 compute-0 nova_compute[187223]: 2025-11-28 17:48:43.893 187227 DEBUG nova.objects.instance [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:48:44 compute-0 nova_compute[187223]: 2025-11-28 17:48:44.259 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:48:45 compute-0 nova_compute[187223]: 2025-11-28 17:48:45.430 187227 DEBUG nova.network.neutron [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Updating instance_info_cache with network_info: [{"id": "9a913108-e2f3-42e3-92f5-51da2a1f5354", "address": "fa:16:3e:1e:4f:32", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a913108-e2", "ovs_interfaceid": "9a913108-e2f3-42e3-92f5-51da2a1f5354", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:48:45 compute-0 nova_compute[187223]: 2025-11-28 17:48:45.464 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Releasing lock "refresh_cache-f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 17:48:45 compute-0 nova_compute[187223]: 2025-11-28 17:48:45.465 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 28 17:48:47 compute-0 nova_compute[187223]: 2025-11-28 17:48:47.459 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:48:47 compute-0 nova_compute[187223]: 2025-11-28 17:48:47.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:48:48 compute-0 nova_compute[187223]: 2025-11-28 17:48:48.223 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:48:49 compute-0 podman[215169]: 2025-11-28 17:48:49.238276542 +0000 UTC m=+0.078072422 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 17:48:49 compute-0 nova_compute[187223]: 2025-11-28 17:48:49.303 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:48:49 compute-0 nova_compute[187223]: 2025-11-28 17:48:49.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:48:52 compute-0 ovn_controller[95574]: 2025-11-28T17:48:52Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:1e:4f:32 10.100.0.12
Nov 28 17:48:52 compute-0 ovn_controller[95574]: 2025-11-28T17:48:52Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:1e:4f:32 10.100.0.12
Nov 28 17:48:53 compute-0 nova_compute[187223]: 2025-11-28 17:48:53.227 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:48:54 compute-0 nova_compute[187223]: 2025-11-28 17:48:54.309 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:48:55 compute-0 podman[215206]: 2025-11-28 17:48:55.247055892 +0000 UTC m=+0.087599518 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=multipathd)
Nov 28 17:48:55 compute-0 podman[215207]: 2025-11-28 17:48:55.291900453 +0000 UTC m=+0.134918674 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 28 17:48:58 compute-0 podman[215254]: 2025-11-28 17:48:58.212116224 +0000 UTC m=+0.073433152 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, release=1755695350, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-type=git, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 28 17:48:58 compute-0 nova_compute[187223]: 2025-11-28 17:48:58.230 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:48:59 compute-0 nova_compute[187223]: 2025-11-28 17:48:59.312 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:48:59 compute-0 podman[197556]: time="2025-11-28T17:48:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:48:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:48:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18322 "" "Go-http-client/1.1"
Nov 28 17:48:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:48:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3063 "" "Go-http-client/1.1"
Nov 28 17:49:01 compute-0 openstack_network_exporter[199717]: ERROR   17:49:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:49:01 compute-0 openstack_network_exporter[199717]: ERROR   17:49:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:49:01 compute-0 openstack_network_exporter[199717]: ERROR   17:49:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:49:01 compute-0 openstack_network_exporter[199717]: ERROR   17:49:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:49:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:49:01 compute-0 openstack_network_exporter[199717]: ERROR   17:49:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:49:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:49:03 compute-0 nova_compute[187223]: 2025-11-28 17:49:03.234 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:49:04 compute-0 nova_compute[187223]: 2025-11-28 17:49:04.316 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:49:08 compute-0 nova_compute[187223]: 2025-11-28 17:49:08.237 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:49:09 compute-0 ovn_controller[95574]: 2025-11-28T17:49:09Z|00146|memory_trim|INFO|Detected inactivity (last active 30008 ms ago): trimming memory
Nov 28 17:49:09 compute-0 nova_compute[187223]: 2025-11-28 17:49:09.318 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:49:11 compute-0 podman[215276]: 2025-11-28 17:49:11.202912835 +0000 UTC m=+0.064226773 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 17:49:13 compute-0 nova_compute[187223]: 2025-11-28 17:49:13.239 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:49:14 compute-0 nova_compute[187223]: 2025-11-28 17:49:14.320 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:49:17 compute-0 nova_compute[187223]: 2025-11-28 17:49:17.976 187227 DEBUG nova.virt.libvirt.driver [None req-0eca0c7f-3e44-45db-92cf-40c2b15f44da a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Check if temp file /var/lib/nova/instances/tmph7koyegb exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Nov 28 17:49:17 compute-0 nova_compute[187223]: 2025-11-28 17:49:17.976 187227 DEBUG nova.compute.manager [None req-0eca0c7f-3e44-45db-92cf-40c2b15f44da a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmph7koyegb',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Nov 28 17:49:18 compute-0 nova_compute[187223]: 2025-11-28 17:49:18.242 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:49:19 compute-0 nova_compute[187223]: 2025-11-28 17:49:19.323 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:49:20 compute-0 podman[215300]: 2025-11-28 17:49:20.200927532 +0000 UTC m=+0.062856188 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 28 17:49:22 compute-0 nova_compute[187223]: 2025-11-28 17:49:22.185 187227 DEBUG oslo_concurrency.processutils [None req-0eca0c7f-3e44-45db-92cf-40c2b15f44da a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:49:22 compute-0 nova_compute[187223]: 2025-11-28 17:49:22.260 187227 DEBUG oslo_concurrency.processutils [None req-0eca0c7f-3e44-45db-92cf-40c2b15f44da a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:49:22 compute-0 nova_compute[187223]: 2025-11-28 17:49:22.262 187227 DEBUG oslo_concurrency.processutils [None req-0eca0c7f-3e44-45db-92cf-40c2b15f44da a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:49:22 compute-0 nova_compute[187223]: 2025-11-28 17:49:22.320 187227 DEBUG oslo_concurrency.processutils [None req-0eca0c7f-3e44-45db-92cf-40c2b15f44da a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:49:23 compute-0 nova_compute[187223]: 2025-11-28 17:49:23.244 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:49:24 compute-0 nova_compute[187223]: 2025-11-28 17:49:24.323 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:49:26 compute-0 podman[215327]: 2025-11-28 17:49:26.218718695 +0000 UTC m=+0.076143382 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 17:49:26 compute-0 podman[215326]: 2025-11-28 17:49:26.230698995 +0000 UTC m=+0.089655102 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 17:49:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:49:27.702 104433 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:49:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:49:27.704 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:49:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:49:27.705 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:49:27 compute-0 sshd-session[215370]: Accepted publickey for nova from 192.168.122.101 port 55632 ssh2: ECDSA SHA256:MQeyK0000lOAPc/y+l43YtWHgJLzao0oTzY3RbV6Jns
Nov 28 17:49:27 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Nov 28 17:49:27 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 28 17:49:27 compute-0 systemd-logind[788]: New session 37 of user nova.
Nov 28 17:49:27 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 28 17:49:27 compute-0 systemd[1]: Starting User Manager for UID 42436...
Nov 28 17:49:27 compute-0 systemd[215374]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 28 17:49:27 compute-0 systemd[215374]: Queued start job for default target Main User Target.
Nov 28 17:49:27 compute-0 systemd[215374]: Created slice User Application Slice.
Nov 28 17:49:27 compute-0 systemd[215374]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 28 17:49:27 compute-0 systemd[215374]: Started Daily Cleanup of User's Temporary Directories.
Nov 28 17:49:27 compute-0 systemd[215374]: Reached target Paths.
Nov 28 17:49:27 compute-0 systemd[215374]: Reached target Timers.
Nov 28 17:49:27 compute-0 systemd[215374]: Starting D-Bus User Message Bus Socket...
Nov 28 17:49:27 compute-0 systemd[215374]: Starting Create User's Volatile Files and Directories...
Nov 28 17:49:28 compute-0 systemd[215374]: Finished Create User's Volatile Files and Directories.
Nov 28 17:49:28 compute-0 systemd[215374]: Listening on D-Bus User Message Bus Socket.
Nov 28 17:49:28 compute-0 systemd[215374]: Reached target Sockets.
Nov 28 17:49:28 compute-0 systemd[215374]: Reached target Basic System.
Nov 28 17:49:28 compute-0 systemd[215374]: Reached target Main User Target.
Nov 28 17:49:28 compute-0 systemd[1]: Started User Manager for UID 42436.
Nov 28 17:49:28 compute-0 systemd[215374]: Startup finished in 174ms.
Nov 28 17:49:28 compute-0 systemd[1]: Started Session 37 of User nova.
Nov 28 17:49:28 compute-0 sshd-session[215370]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 28 17:49:28 compute-0 sshd-session[215389]: Received disconnect from 192.168.122.101 port 55632:11: disconnected by user
Nov 28 17:49:28 compute-0 sshd-session[215389]: Disconnected from user nova 192.168.122.101 port 55632
Nov 28 17:49:28 compute-0 sshd-session[215370]: pam_unix(sshd:session): session closed for user nova
Nov 28 17:49:28 compute-0 systemd[1]: session-37.scope: Deactivated successfully.
Nov 28 17:49:28 compute-0 systemd-logind[788]: Session 37 logged out. Waiting for processes to exit.
Nov 28 17:49:28 compute-0 systemd-logind[788]: Removed session 37.
Nov 28 17:49:28 compute-0 nova_compute[187223]: 2025-11-28 17:49:28.247 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:49:29 compute-0 podman[215391]: 2025-11-28 17:49:29.226888983 +0000 UTC m=+0.080608197 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, version=9.6, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, distribution-scope=public, build-date=2025-08-20T13:12:41, 
io.buildah.version=1.33.7, vendor=Red Hat, Inc., name=ubi9-minimal, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm)
Nov 28 17:49:29 compute-0 nova_compute[187223]: 2025-11-28 17:49:29.328 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:49:29 compute-0 nova_compute[187223]: 2025-11-28 17:49:29.529 187227 DEBUG nova.compute.manager [req-e2df80d3-f73c-43dd-8a2f-34034c7f90fa req-9e243d5a-36da-4dec-8371-1f98942b4ddf 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Received event network-vif-unplugged-9a913108-e2f3-42e3-92f5-51da2a1f5354 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:49:29 compute-0 nova_compute[187223]: 2025-11-28 17:49:29.530 187227 DEBUG oslo_concurrency.lockutils [req-e2df80d3-f73c-43dd-8a2f-34034c7f90fa req-9e243d5a-36da-4dec-8371-1f98942b4ddf 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:49:29 compute-0 nova_compute[187223]: 2025-11-28 17:49:29.530 187227 DEBUG oslo_concurrency.lockutils [req-e2df80d3-f73c-43dd-8a2f-34034c7f90fa req-9e243d5a-36da-4dec-8371-1f98942b4ddf 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:49:29 compute-0 nova_compute[187223]: 2025-11-28 17:49:29.531 187227 DEBUG oslo_concurrency.lockutils [req-e2df80d3-f73c-43dd-8a2f-34034c7f90fa req-9e243d5a-36da-4dec-8371-1f98942b4ddf 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:49:29 compute-0 nova_compute[187223]: 2025-11-28 17:49:29.531 187227 DEBUG nova.compute.manager [req-e2df80d3-f73c-43dd-8a2f-34034c7f90fa req-9e243d5a-36da-4dec-8371-1f98942b4ddf 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] No waiting events found dispatching network-vif-unplugged-9a913108-e2f3-42e3-92f5-51da2a1f5354 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:49:29 compute-0 nova_compute[187223]: 2025-11-28 17:49:29.531 187227 DEBUG nova.compute.manager [req-e2df80d3-f73c-43dd-8a2f-34034c7f90fa req-9e243d5a-36da-4dec-8371-1f98942b4ddf 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Received event network-vif-unplugged-9a913108-e2f3-42e3-92f5-51da2a1f5354 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 28 17:49:29 compute-0 podman[197556]: time="2025-11-28T17:49:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:49:29 compute-0 podman[197556]: @ - - [28/Nov/2025:17:49:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18322 "" "Go-http-client/1.1"
Nov 28 17:49:29 compute-0 podman[197556]: @ - - [28/Nov/2025:17:49:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3055 "" "Go-http-client/1.1"
Nov 28 17:49:30 compute-0 nova_compute[187223]: 2025-11-28 17:49:30.248 187227 INFO nova.compute.manager [None req-0eca0c7f-3e44-45db-92cf-40c2b15f44da a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Took 7.93 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Nov 28 17:49:30 compute-0 nova_compute[187223]: 2025-11-28 17:49:30.249 187227 DEBUG nova.compute.manager [None req-0eca0c7f-3e44-45db-92cf-40c2b15f44da a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 28 17:49:30 compute-0 nova_compute[187223]: 2025-11-28 17:49:30.269 187227 DEBUG nova.compute.manager [None req-0eca0c7f-3e44-45db-92cf-40c2b15f44da a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmph7koyegb',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(bd50638f-86cb-4105-8671-9e3b560c5829),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Nov 28 17:49:30 compute-0 nova_compute[187223]: 2025-11-28 17:49:30.304 187227 DEBUG nova.objects.instance [None req-0eca0c7f-3e44-45db-92cf-40c2b15f44da a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lazy-loading 'migration_context' on Instance uuid f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:49:30 compute-0 nova_compute[187223]: 2025-11-28 17:49:30.305 187227 DEBUG nova.virt.libvirt.driver [None req-0eca0c7f-3e44-45db-92cf-40c2b15f44da a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Nov 28 17:49:30 compute-0 nova_compute[187223]: 2025-11-28 17:49:30.307 187227 DEBUG nova.virt.libvirt.driver [None req-0eca0c7f-3e44-45db-92cf-40c2b15f44da a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Nov 28 17:49:30 compute-0 nova_compute[187223]: 2025-11-28 17:49:30.307 187227 DEBUG nova.virt.libvirt.driver [None req-0eca0c7f-3e44-45db-92cf-40c2b15f44da a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Nov 28 17:49:30 compute-0 nova_compute[187223]: 2025-11-28 17:49:30.325 187227 DEBUG nova.virt.libvirt.vif [None req-0eca0c7f-3e44-45db-92cf-40c2b15f44da a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T17:48:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-23084288',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-23084288',id=18,image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-28T17:48:39Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='f987f40adf1f46018ab0ca81b8d954f6',ramdisk_id='',reservation_id='r-rqnvb0oe',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owne
r_project_name='tempest-TestExecuteStrategies-384316604',owner_user_name='tempest-TestExecuteStrategies-384316604-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-28T17:48:39Z,user_data=None,user_id='40bca16232f3471c8094a414f8874e9a',uuid=f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9a913108-e2f3-42e3-92f5-51da2a1f5354", "address": "fa:16:3e:1e:4f:32", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap9a913108-e2", "ovs_interfaceid": "9a913108-e2f3-42e3-92f5-51da2a1f5354", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 28 17:49:30 compute-0 nova_compute[187223]: 2025-11-28 17:49:30.326 187227 DEBUG nova.network.os_vif_util [None req-0eca0c7f-3e44-45db-92cf-40c2b15f44da a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Converting VIF {"id": "9a913108-e2f3-42e3-92f5-51da2a1f5354", "address": "fa:16:3e:1e:4f:32", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap9a913108-e2", "ovs_interfaceid": "9a913108-e2f3-42e3-92f5-51da2a1f5354", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 17:49:30 compute-0 nova_compute[187223]: 2025-11-28 17:49:30.327 187227 DEBUG nova.network.os_vif_util [None req-0eca0c7f-3e44-45db-92cf-40c2b15f44da a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1e:4f:32,bridge_name='br-int',has_traffic_filtering=True,id=9a913108-e2f3-42e3-92f5-51da2a1f5354,network=Network(7710a7d0-31b3-4473-89c4-40533fdd6e7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9a913108-e2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 17:49:30 compute-0 nova_compute[187223]: 2025-11-28 17:49:30.327 187227 DEBUG nova.virt.libvirt.migration [None req-0eca0c7f-3e44-45db-92cf-40c2b15f44da a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Updating guest XML with vif config: <interface type="ethernet">
Nov 28 17:49:30 compute-0 nova_compute[187223]:   <mac address="fa:16:3e:1e:4f:32"/>
Nov 28 17:49:30 compute-0 nova_compute[187223]:   <model type="virtio"/>
Nov 28 17:49:30 compute-0 nova_compute[187223]:   <driver name="vhost" rx_queue_size="512"/>
Nov 28 17:49:30 compute-0 nova_compute[187223]:   <mtu size="1442"/>
Nov 28 17:49:30 compute-0 nova_compute[187223]:   <target dev="tap9a913108-e2"/>
Nov 28 17:49:30 compute-0 nova_compute[187223]: </interface>
Nov 28 17:49:30 compute-0 nova_compute[187223]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Nov 28 17:49:30 compute-0 nova_compute[187223]: 2025-11-28 17:49:30.328 187227 DEBUG nova.virt.libvirt.driver [None req-0eca0c7f-3e44-45db-92cf-40c2b15f44da a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Nov 28 17:49:30 compute-0 nova_compute[187223]: 2025-11-28 17:49:30.810 187227 DEBUG nova.virt.libvirt.migration [None req-0eca0c7f-3e44-45db-92cf-40c2b15f44da a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 28 17:49:30 compute-0 nova_compute[187223]: 2025-11-28 17:49:30.810 187227 INFO nova.virt.libvirt.migration [None req-0eca0c7f-3e44-45db-92cf-40c2b15f44da a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Increasing downtime to 50 ms after 0 sec elapsed time
Nov 28 17:49:30 compute-0 nova_compute[187223]: 2025-11-28 17:49:30.893 187227 INFO nova.virt.libvirt.driver [None req-0eca0c7f-3e44-45db-92cf-40c2b15f44da a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Nov 28 17:49:31 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Nov 28 17:49:31 compute-0 nova_compute[187223]: 2025-11-28 17:49:31.397 187227 DEBUG nova.virt.libvirt.migration [None req-0eca0c7f-3e44-45db-92cf-40c2b15f44da a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 28 17:49:31 compute-0 nova_compute[187223]: 2025-11-28 17:49:31.398 187227 DEBUG nova.virt.libvirt.migration [None req-0eca0c7f-3e44-45db-92cf-40c2b15f44da a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 28 17:49:31 compute-0 openstack_network_exporter[199717]: ERROR   17:49:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:49:31 compute-0 openstack_network_exporter[199717]: ERROR   17:49:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:49:31 compute-0 openstack_network_exporter[199717]: ERROR   17:49:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:49:31 compute-0 openstack_network_exporter[199717]: ERROR   17:49:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:49:31 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:49:31 compute-0 openstack_network_exporter[199717]: ERROR   17:49:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:49:31 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:49:31 compute-0 nova_compute[187223]: 2025-11-28 17:49:31.630 187227 DEBUG nova.compute.manager [req-de9a9b7b-08e1-46a0-900c-e9ffca501faf req-00853e5e-cf8e-44f2-b736-a62b2519664d 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Received event network-vif-plugged-9a913108-e2f3-42e3-92f5-51da2a1f5354 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:49:31 compute-0 nova_compute[187223]: 2025-11-28 17:49:31.630 187227 DEBUG oslo_concurrency.lockutils [req-de9a9b7b-08e1-46a0-900c-e9ffca501faf req-00853e5e-cf8e-44f2-b736-a62b2519664d 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:49:31 compute-0 nova_compute[187223]: 2025-11-28 17:49:31.630 187227 DEBUG oslo_concurrency.lockutils [req-de9a9b7b-08e1-46a0-900c-e9ffca501faf req-00853e5e-cf8e-44f2-b736-a62b2519664d 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:49:31 compute-0 nova_compute[187223]: 2025-11-28 17:49:31.630 187227 DEBUG oslo_concurrency.lockutils [req-de9a9b7b-08e1-46a0-900c-e9ffca501faf req-00853e5e-cf8e-44f2-b736-a62b2519664d 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:49:31 compute-0 nova_compute[187223]: 2025-11-28 17:49:31.631 187227 DEBUG nova.compute.manager [req-de9a9b7b-08e1-46a0-900c-e9ffca501faf req-00853e5e-cf8e-44f2-b736-a62b2519664d 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] No waiting events found dispatching network-vif-plugged-9a913108-e2f3-42e3-92f5-51da2a1f5354 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:49:31 compute-0 nova_compute[187223]: 2025-11-28 17:49:31.631 187227 WARNING nova.compute.manager [req-de9a9b7b-08e1-46a0-900c-e9ffca501faf req-00853e5e-cf8e-44f2-b736-a62b2519664d 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Received unexpected event network-vif-plugged-9a913108-e2f3-42e3-92f5-51da2a1f5354 for instance with vm_state active and task_state migrating.
Nov 28 17:49:31 compute-0 nova_compute[187223]: 2025-11-28 17:49:31.631 187227 DEBUG nova.compute.manager [req-de9a9b7b-08e1-46a0-900c-e9ffca501faf req-00853e5e-cf8e-44f2-b736-a62b2519664d 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Received event network-changed-9a913108-e2f3-42e3-92f5-51da2a1f5354 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:49:31 compute-0 nova_compute[187223]: 2025-11-28 17:49:31.631 187227 DEBUG nova.compute.manager [req-de9a9b7b-08e1-46a0-900c-e9ffca501faf req-00853e5e-cf8e-44f2-b736-a62b2519664d 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Refreshing instance network info cache due to event network-changed-9a913108-e2f3-42e3-92f5-51da2a1f5354. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 28 17:49:31 compute-0 nova_compute[187223]: 2025-11-28 17:49:31.631 187227 DEBUG oslo_concurrency.lockutils [req-de9a9b7b-08e1-46a0-900c-e9ffca501faf req-00853e5e-cf8e-44f2-b736-a62b2519664d 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "refresh_cache-f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 17:49:31 compute-0 nova_compute[187223]: 2025-11-28 17:49:31.631 187227 DEBUG oslo_concurrency.lockutils [req-de9a9b7b-08e1-46a0-900c-e9ffca501faf req-00853e5e-cf8e-44f2-b736-a62b2519664d 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquired lock "refresh_cache-f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 17:49:31 compute-0 nova_compute[187223]: 2025-11-28 17:49:31.631 187227 DEBUG nova.network.neutron [req-de9a9b7b-08e1-46a0-900c-e9ffca501faf req-00853e5e-cf8e-44f2-b736-a62b2519664d 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Refreshing network info cache for port 9a913108-e2f3-42e3-92f5-51da2a1f5354 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 28 17:49:31 compute-0 nova_compute[187223]: 2025-11-28 17:49:31.901 187227 DEBUG nova.virt.libvirt.migration [None req-0eca0c7f-3e44-45db-92cf-40c2b15f44da a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 28 17:49:31 compute-0 nova_compute[187223]: 2025-11-28 17:49:31.902 187227 DEBUG nova.virt.libvirt.migration [None req-0eca0c7f-3e44-45db-92cf-40c2b15f44da a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 28 17:49:32 compute-0 nova_compute[187223]: 2025-11-28 17:49:32.406 187227 DEBUG nova.virt.libvirt.migration [None req-0eca0c7f-3e44-45db-92cf-40c2b15f44da a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 28 17:49:32 compute-0 nova_compute[187223]: 2025-11-28 17:49:32.406 187227 DEBUG nova.virt.libvirt.migration [None req-0eca0c7f-3e44-45db-92cf-40c2b15f44da a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 28 17:49:32 compute-0 nova_compute[187223]: 2025-11-28 17:49:32.533 187227 DEBUG nova.virt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Emitting event <LifecycleEvent: 1764352172.5332289, f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:49:32 compute-0 nova_compute[187223]: 2025-11-28 17:49:32.534 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] VM Paused (Lifecycle Event)
Nov 28 17:49:32 compute-0 nova_compute[187223]: 2025-11-28 17:49:32.551 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:49:32 compute-0 nova_compute[187223]: 2025-11-28 17:49:32.555 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 28 17:49:32 compute-0 nova_compute[187223]: 2025-11-28 17:49:32.582 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] During sync_power_state the instance has a pending task (migrating). Skip.
Nov 28 17:49:32 compute-0 kernel: tap9a913108-e2 (unregistering): left promiscuous mode
Nov 28 17:49:32 compute-0 NetworkManager[55763]: <info>  [1764352172.6702] device (tap9a913108-e2): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 28 17:49:32 compute-0 ovn_controller[95574]: 2025-11-28T17:49:32Z|00147|binding|INFO|Releasing lport 9a913108-e2f3-42e3-92f5-51da2a1f5354 from this chassis (sb_readonly=0)
Nov 28 17:49:32 compute-0 nova_compute[187223]: 2025-11-28 17:49:32.678 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:49:32 compute-0 ovn_controller[95574]: 2025-11-28T17:49:32Z|00148|binding|INFO|Setting lport 9a913108-e2f3-42e3-92f5-51da2a1f5354 down in Southbound
Nov 28 17:49:32 compute-0 ovn_controller[95574]: 2025-11-28T17:49:32Z|00149|binding|INFO|Removing iface tap9a913108-e2 ovn-installed in OVS
Nov 28 17:49:32 compute-0 nova_compute[187223]: 2025-11-28 17:49:32.682 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:49:32 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:49:32.685 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1e:4f:32 10.100.0.12'], port_security=['fa:16:3e:1e:4f:32 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '01f1e5e2-191c-41ea-9a37-abbc72987efb'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f987f40adf1f46018ab0ca81b8d954f6', 'neutron:revision_number': '8', 'neutron:security_group_ids': '2454196c-3f33-40c4-b65e-ff5ce51cee25', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8b91a17c-2725-4cda-baa8-a6987e8e4bba, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], logical_port=9a913108-e2f3-42e3-92f5-51da2a1f5354) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 17:49:32 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:49:32.687 104433 INFO neutron.agent.ovn.metadata.agent [-] Port 9a913108-e2f3-42e3-92f5-51da2a1f5354 in datapath 7710a7d0-31b3-4473-89c4-40533fdd6e7d unbound from our chassis
Nov 28 17:49:32 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:49:32.688 104433 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7710a7d0-31b3-4473-89c4-40533fdd6e7d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 17:49:32 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:49:32.689 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[25881ebc-a030-4012-8571-45394ad94bc0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:49:32 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:49:32.690 104433 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d namespace which is not needed anymore
Nov 28 17:49:32 compute-0 nova_compute[187223]: 2025-11-28 17:49:32.696 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:49:32 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000012.scope: Deactivated successfully.
Nov 28 17:49:32 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000012.scope: Consumed 15.194s CPU time.
Nov 28 17:49:32 compute-0 systemd-machined[153517]: Machine qemu-13-instance-00000012 terminated.
Nov 28 17:49:32 compute-0 neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d[215124]: [NOTICE]   (215128) : haproxy version is 2.8.14-c23fe91
Nov 28 17:49:32 compute-0 neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d[215124]: [NOTICE]   (215128) : path to executable is /usr/sbin/haproxy
Nov 28 17:49:32 compute-0 neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d[215124]: [WARNING]  (215128) : Exiting Master process...
Nov 28 17:49:32 compute-0 neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d[215124]: [WARNING]  (215128) : Exiting Master process...
Nov 28 17:49:32 compute-0 neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d[215124]: [ALERT]    (215128) : Current worker (215130) exited with code 143 (Terminated)
Nov 28 17:49:32 compute-0 neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d[215124]: [WARNING]  (215128) : All workers exited. Exiting... (0)
Nov 28 17:49:32 compute-0 systemd[1]: libpod-db3e1a7d141fbc8b7af8d50d75b661aa684bc5c5d5cd31c68501ab623a00a28f.scope: Deactivated successfully.
Nov 28 17:49:32 compute-0 podman[215449]: 2025-11-28 17:49:32.843009837 +0000 UTC m=+0.050073027 container died db3e1a7d141fbc8b7af8d50d75b661aa684bc5c5d5cd31c68501ab623a00a28f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 17:49:32 compute-0 systemd[1]: var-lib-containers-storage-overlay-b9498ee30e2e0545a1b335bd5ddcfc909449b5e41595a7df5fa2603f0dcbbc71-merged.mount: Deactivated successfully.
Nov 28 17:49:32 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-db3e1a7d141fbc8b7af8d50d75b661aa684bc5c5d5cd31c68501ab623a00a28f-userdata-shm.mount: Deactivated successfully.
Nov 28 17:49:32 compute-0 podman[215449]: 2025-11-28 17:49:32.879933513 +0000 UTC m=+0.086996693 container cleanup db3e1a7d141fbc8b7af8d50d75b661aa684bc5c5d5cd31c68501ab623a00a28f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 28 17:49:32 compute-0 nova_compute[187223]: 2025-11-28 17:49:32.915 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:49:32 compute-0 systemd[1]: libpod-conmon-db3e1a7d141fbc8b7af8d50d75b661aa684bc5c5d5cd31c68501ab623a00a28f.scope: Deactivated successfully.
Nov 28 17:49:32 compute-0 nova_compute[187223]: 2025-11-28 17:49:32.948 187227 DEBUG nova.virt.libvirt.guest [None req-0eca0c7f-3e44-45db-92cf-40c2b15f44da a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Nov 28 17:49:32 compute-0 nova_compute[187223]: 2025-11-28 17:49:32.948 187227 INFO nova.virt.libvirt.driver [None req-0eca0c7f-3e44-45db-92cf-40c2b15f44da a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Migration operation has completed
Nov 28 17:49:32 compute-0 nova_compute[187223]: 2025-11-28 17:49:32.949 187227 INFO nova.compute.manager [None req-0eca0c7f-3e44-45db-92cf-40c2b15f44da a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] _post_live_migration() is started..
Nov 28 17:49:32 compute-0 nova_compute[187223]: 2025-11-28 17:49:32.954 187227 DEBUG nova.virt.libvirt.driver [None req-0eca0c7f-3e44-45db-92cf-40c2b15f44da a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Nov 28 17:49:32 compute-0 nova_compute[187223]: 2025-11-28 17:49:32.954 187227 DEBUG nova.virt.libvirt.driver [None req-0eca0c7f-3e44-45db-92cf-40c2b15f44da a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Nov 28 17:49:32 compute-0 nova_compute[187223]: 2025-11-28 17:49:32.954 187227 DEBUG nova.virt.libvirt.driver [None req-0eca0c7f-3e44-45db-92cf-40c2b15f44da a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Nov 28 17:49:32 compute-0 podman[215487]: 2025-11-28 17:49:32.984391557 +0000 UTC m=+0.046012962 container remove db3e1a7d141fbc8b7af8d50d75b661aa684bc5c5d5cd31c68501ab623a00a28f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125)
Nov 28 17:49:32 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:49:32.988 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[e7826777-6cdb-4b1c-ad1b-d20d21d324e8]: (4, ('Fri Nov 28 05:49:32 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d (db3e1a7d141fbc8b7af8d50d75b661aa684bc5c5d5cd31c68501ab623a00a28f)\ndb3e1a7d141fbc8b7af8d50d75b661aa684bc5c5d5cd31c68501ab623a00a28f\nFri Nov 28 05:49:32 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d (db3e1a7d141fbc8b7af8d50d75b661aa684bc5c5d5cd31c68501ab623a00a28f)\ndb3e1a7d141fbc8b7af8d50d75b661aa684bc5c5d5cd31c68501ab623a00a28f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:49:32 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:49:32.990 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[e88f45c7-7ee4-426c-87bc-a7669286f8f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:49:32 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:49:32.991 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7710a7d0-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:49:32 compute-0 kernel: tap7710a7d0-30: left promiscuous mode
Nov 28 17:49:32 compute-0 nova_compute[187223]: 2025-11-28 17:49:32.994 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:49:33 compute-0 nova_compute[187223]: 2025-11-28 17:49:33.007 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:49:33 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:49:33.010 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[e49e85bf-7087-4285-8d80-9a6ae399f599]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:49:33 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:49:33.024 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[05962c0b-6598-4840-83c8-9a82b6362a2b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:49:33 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:49:33.025 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[04d98bc7-6f3d-473b-9c99-ddbd966cc3d7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:49:33 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:49:33.038 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[0d22c49b-7845-45e6-8810-b4cb87deaae6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 529385, 'reachable_time': 39261, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215512, 'error': None, 'target': 'ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:49:33 compute-0 systemd[1]: run-netns-ovnmeta\x2d7710a7d0\x2d31b3\x2d4473\x2d89c4\x2d40533fdd6e7d.mount: Deactivated successfully.
Nov 28 17:49:33 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:49:33.043 104546 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 28 17:49:33 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:49:33.043 104546 DEBUG oslo.privsep.daemon [-] privsep: reply[5f38d0fc-eb8a-47ce-889e-24b93ed39a5d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:49:33 compute-0 nova_compute[187223]: 2025-11-28 17:49:33.251 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:49:33 compute-0 nova_compute[187223]: 2025-11-28 17:49:33.763 187227 DEBUG nova.compute.manager [req-e87ac484-6cc2-4cd7-95bb-515ad21de8ab req-f1a83e67-eadb-4a7f-b4a6-58bf81fdce92 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Received event network-vif-unplugged-9a913108-e2f3-42e3-92f5-51da2a1f5354 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:49:33 compute-0 nova_compute[187223]: 2025-11-28 17:49:33.764 187227 DEBUG oslo_concurrency.lockutils [req-e87ac484-6cc2-4cd7-95bb-515ad21de8ab req-f1a83e67-eadb-4a7f-b4a6-58bf81fdce92 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:49:33 compute-0 nova_compute[187223]: 2025-11-28 17:49:33.764 187227 DEBUG oslo_concurrency.lockutils [req-e87ac484-6cc2-4cd7-95bb-515ad21de8ab req-f1a83e67-eadb-4a7f-b4a6-58bf81fdce92 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:49:33 compute-0 nova_compute[187223]: 2025-11-28 17:49:33.764 187227 DEBUG oslo_concurrency.lockutils [req-e87ac484-6cc2-4cd7-95bb-515ad21de8ab req-f1a83e67-eadb-4a7f-b4a6-58bf81fdce92 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:49:33 compute-0 nova_compute[187223]: 2025-11-28 17:49:33.764 187227 DEBUG nova.compute.manager [req-e87ac484-6cc2-4cd7-95bb-515ad21de8ab req-f1a83e67-eadb-4a7f-b4a6-58bf81fdce92 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] No waiting events found dispatching network-vif-unplugged-9a913108-e2f3-42e3-92f5-51da2a1f5354 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:49:33 compute-0 nova_compute[187223]: 2025-11-28 17:49:33.764 187227 DEBUG nova.compute.manager [req-e87ac484-6cc2-4cd7-95bb-515ad21de8ab req-f1a83e67-eadb-4a7f-b4a6-58bf81fdce92 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Received event network-vif-unplugged-9a913108-e2f3-42e3-92f5-51da2a1f5354 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 28 17:49:33 compute-0 nova_compute[187223]: 2025-11-28 17:49:33.765 187227 DEBUG nova.compute.manager [req-e87ac484-6cc2-4cd7-95bb-515ad21de8ab req-f1a83e67-eadb-4a7f-b4a6-58bf81fdce92 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Received event network-vif-plugged-9a913108-e2f3-42e3-92f5-51da2a1f5354 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:49:33 compute-0 nova_compute[187223]: 2025-11-28 17:49:33.765 187227 DEBUG oslo_concurrency.lockutils [req-e87ac484-6cc2-4cd7-95bb-515ad21de8ab req-f1a83e67-eadb-4a7f-b4a6-58bf81fdce92 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:49:33 compute-0 nova_compute[187223]: 2025-11-28 17:49:33.765 187227 DEBUG oslo_concurrency.lockutils [req-e87ac484-6cc2-4cd7-95bb-515ad21de8ab req-f1a83e67-eadb-4a7f-b4a6-58bf81fdce92 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:49:33 compute-0 nova_compute[187223]: 2025-11-28 17:49:33.765 187227 DEBUG oslo_concurrency.lockutils [req-e87ac484-6cc2-4cd7-95bb-515ad21de8ab req-f1a83e67-eadb-4a7f-b4a6-58bf81fdce92 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:49:33 compute-0 nova_compute[187223]: 2025-11-28 17:49:33.765 187227 DEBUG nova.compute.manager [req-e87ac484-6cc2-4cd7-95bb-515ad21de8ab req-f1a83e67-eadb-4a7f-b4a6-58bf81fdce92 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] No waiting events found dispatching network-vif-plugged-9a913108-e2f3-42e3-92f5-51da2a1f5354 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:49:33 compute-0 nova_compute[187223]: 2025-11-28 17:49:33.765 187227 WARNING nova.compute.manager [req-e87ac484-6cc2-4cd7-95bb-515ad21de8ab req-f1a83e67-eadb-4a7f-b4a6-58bf81fdce92 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Received unexpected event network-vif-plugged-9a913108-e2f3-42e3-92f5-51da2a1f5354 for instance with vm_state active and task_state migrating.
Nov 28 17:49:34 compute-0 nova_compute[187223]: 2025-11-28 17:49:34.329 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:49:34 compute-0 nova_compute[187223]: 2025-11-28 17:49:34.585 187227 DEBUG nova.network.neutron [req-de9a9b7b-08e1-46a0-900c-e9ffca501faf req-00853e5e-cf8e-44f2-b736-a62b2519664d 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Updated VIF entry in instance network info cache for port 9a913108-e2f3-42e3-92f5-51da2a1f5354. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 28 17:49:34 compute-0 nova_compute[187223]: 2025-11-28 17:49:34.586 187227 DEBUG nova.network.neutron [req-de9a9b7b-08e1-46a0-900c-e9ffca501faf req-00853e5e-cf8e-44f2-b736-a62b2519664d 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Updating instance_info_cache with network_info: [{"id": "9a913108-e2f3-42e3-92f5-51da2a1f5354", "address": "fa:16:3e:1e:4f:32", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a913108-e2", "ovs_interfaceid": "9a913108-e2f3-42e3-92f5-51da2a1f5354", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:49:34 compute-0 nova_compute[187223]: 2025-11-28 17:49:34.630 187227 DEBUG oslo_concurrency.lockutils [req-de9a9b7b-08e1-46a0-900c-e9ffca501faf req-00853e5e-cf8e-44f2-b736-a62b2519664d 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Releasing lock "refresh_cache-f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 17:49:34 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:49:34.795 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3e:3c:af', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '0e:e0:60:f9:5f:25'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 17:49:34 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:49:34.796 104433 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 17:49:34 compute-0 nova_compute[187223]: 2025-11-28 17:49:34.797 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:49:35 compute-0 nova_compute[187223]: 2025-11-28 17:49:35.178 187227 DEBUG nova.network.neutron [None req-0eca0c7f-3e44-45db-92cf-40c2b15f44da a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Activated binding for port 9a913108-e2f3-42e3-92f5-51da2a1f5354 and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Nov 28 17:49:35 compute-0 nova_compute[187223]: 2025-11-28 17:49:35.178 187227 DEBUG nova.compute.manager [None req-0eca0c7f-3e44-45db-92cf-40c2b15f44da a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "9a913108-e2f3-42e3-92f5-51da2a1f5354", "address": "fa:16:3e:1e:4f:32", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a913108-e2", "ovs_interfaceid": "9a913108-e2f3-42e3-92f5-51da2a1f5354", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Nov 28 17:49:35 compute-0 nova_compute[187223]: 2025-11-28 17:49:35.179 187227 DEBUG nova.virt.libvirt.vif [None req-0eca0c7f-3e44-45db-92cf-40c2b15f44da a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T17:48:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-23084288',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-23084288',id=18,image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-28T17:48:39Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='f987f40adf1f46018ab0ca81b8d954f6',ramdisk_id='',reservation_id='r-rqnvb0oe',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-384316604',owner_user_name='tempest-TestExecuteStrategies-384316604-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-28T17:49:13Z,user_data=None,user_id='40bca16232f3471c8094a414f8874e9a',uuid=f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9a913108-e2f3-42e3-92f5-51da2a1f5354", "address": "fa:16:3e:1e:4f:32", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a913108-e2", "ovs_interfaceid": "9a913108-e2f3-42e3-92f5-51da2a1f5354", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 28 17:49:35 compute-0 nova_compute[187223]: 2025-11-28 17:49:35.179 187227 DEBUG nova.network.os_vif_util [None req-0eca0c7f-3e44-45db-92cf-40c2b15f44da a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Converting VIF {"id": "9a913108-e2f3-42e3-92f5-51da2a1f5354", "address": "fa:16:3e:1e:4f:32", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a913108-e2", "ovs_interfaceid": "9a913108-e2f3-42e3-92f5-51da2a1f5354", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 17:49:35 compute-0 nova_compute[187223]: 2025-11-28 17:49:35.180 187227 DEBUG nova.network.os_vif_util [None req-0eca0c7f-3e44-45db-92cf-40c2b15f44da a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1e:4f:32,bridge_name='br-int',has_traffic_filtering=True,id=9a913108-e2f3-42e3-92f5-51da2a1f5354,network=Network(7710a7d0-31b3-4473-89c4-40533fdd6e7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9a913108-e2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 17:49:35 compute-0 nova_compute[187223]: 2025-11-28 17:49:35.180 187227 DEBUG os_vif [None req-0eca0c7f-3e44-45db-92cf-40c2b15f44da a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:1e:4f:32,bridge_name='br-int',has_traffic_filtering=True,id=9a913108-e2f3-42e3-92f5-51da2a1f5354,network=Network(7710a7d0-31b3-4473-89c4-40533fdd6e7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9a913108-e2') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 28 17:49:35 compute-0 nova_compute[187223]: 2025-11-28 17:49:35.182 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:49:35 compute-0 nova_compute[187223]: 2025-11-28 17:49:35.182 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9a913108-e2, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:49:35 compute-0 nova_compute[187223]: 2025-11-28 17:49:35.184 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:49:35 compute-0 nova_compute[187223]: 2025-11-28 17:49:35.185 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 17:49:35 compute-0 nova_compute[187223]: 2025-11-28 17:49:35.186 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:49:35 compute-0 nova_compute[187223]: 2025-11-28 17:49:35.189 187227 INFO os_vif [None req-0eca0c7f-3e44-45db-92cf-40c2b15f44da a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:1e:4f:32,bridge_name='br-int',has_traffic_filtering=True,id=9a913108-e2f3-42e3-92f5-51da2a1f5354,network=Network(7710a7d0-31b3-4473-89c4-40533fdd6e7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9a913108-e2')
Nov 28 17:49:35 compute-0 nova_compute[187223]: 2025-11-28 17:49:35.189 187227 DEBUG oslo_concurrency.lockutils [None req-0eca0c7f-3e44-45db-92cf-40c2b15f44da a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:49:35 compute-0 nova_compute[187223]: 2025-11-28 17:49:35.189 187227 DEBUG oslo_concurrency.lockutils [None req-0eca0c7f-3e44-45db-92cf-40c2b15f44da a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:49:35 compute-0 nova_compute[187223]: 2025-11-28 17:49:35.189 187227 DEBUG oslo_concurrency.lockutils [None req-0eca0c7f-3e44-45db-92cf-40c2b15f44da a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:49:35 compute-0 nova_compute[187223]: 2025-11-28 17:49:35.190 187227 DEBUG nova.compute.manager [None req-0eca0c7f-3e44-45db-92cf-40c2b15f44da a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Nov 28 17:49:35 compute-0 nova_compute[187223]: 2025-11-28 17:49:35.190 187227 INFO nova.virt.libvirt.driver [None req-0eca0c7f-3e44-45db-92cf-40c2b15f44da a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Deleting instance files /var/lib/nova/instances/f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a_del
Nov 28 17:49:35 compute-0 nova_compute[187223]: 2025-11-28 17:49:35.190 187227 INFO nova.virt.libvirt.driver [None req-0eca0c7f-3e44-45db-92cf-40c2b15f44da a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Deletion of /var/lib/nova/instances/f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a_del complete
Nov 28 17:49:35 compute-0 nova_compute[187223]: 2025-11-28 17:49:35.889 187227 DEBUG nova.compute.manager [req-a48ceb7d-e2a3-485f-b10e-d07fd6d4e807 req-dce7f145-81d3-4ba0-bf79-9388dd5b1a1c 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Received event network-vif-plugged-9a913108-e2f3-42e3-92f5-51da2a1f5354 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:49:35 compute-0 nova_compute[187223]: 2025-11-28 17:49:35.890 187227 DEBUG oslo_concurrency.lockutils [req-a48ceb7d-e2a3-485f-b10e-d07fd6d4e807 req-dce7f145-81d3-4ba0-bf79-9388dd5b1a1c 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:49:35 compute-0 nova_compute[187223]: 2025-11-28 17:49:35.891 187227 DEBUG oslo_concurrency.lockutils [req-a48ceb7d-e2a3-485f-b10e-d07fd6d4e807 req-dce7f145-81d3-4ba0-bf79-9388dd5b1a1c 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:49:35 compute-0 nova_compute[187223]: 2025-11-28 17:49:35.892 187227 DEBUG oslo_concurrency.lockutils [req-a48ceb7d-e2a3-485f-b10e-d07fd6d4e807 req-dce7f145-81d3-4ba0-bf79-9388dd5b1a1c 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:49:35 compute-0 nova_compute[187223]: 2025-11-28 17:49:35.892 187227 DEBUG nova.compute.manager [req-a48ceb7d-e2a3-485f-b10e-d07fd6d4e807 req-dce7f145-81d3-4ba0-bf79-9388dd5b1a1c 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] No waiting events found dispatching network-vif-plugged-9a913108-e2f3-42e3-92f5-51da2a1f5354 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:49:35 compute-0 nova_compute[187223]: 2025-11-28 17:49:35.892 187227 WARNING nova.compute.manager [req-a48ceb7d-e2a3-485f-b10e-d07fd6d4e807 req-dce7f145-81d3-4ba0-bf79-9388dd5b1a1c 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Received unexpected event network-vif-plugged-9a913108-e2f3-42e3-92f5-51da2a1f5354 for instance with vm_state active and task_state migrating.
Nov 28 17:49:35 compute-0 nova_compute[187223]: 2025-11-28 17:49:35.893 187227 DEBUG nova.compute.manager [req-a48ceb7d-e2a3-485f-b10e-d07fd6d4e807 req-dce7f145-81d3-4ba0-bf79-9388dd5b1a1c 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Received event network-vif-unplugged-9a913108-e2f3-42e3-92f5-51da2a1f5354 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:49:35 compute-0 nova_compute[187223]: 2025-11-28 17:49:35.893 187227 DEBUG oslo_concurrency.lockutils [req-a48ceb7d-e2a3-485f-b10e-d07fd6d4e807 req-dce7f145-81d3-4ba0-bf79-9388dd5b1a1c 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:49:35 compute-0 nova_compute[187223]: 2025-11-28 17:49:35.893 187227 DEBUG oslo_concurrency.lockutils [req-a48ceb7d-e2a3-485f-b10e-d07fd6d4e807 req-dce7f145-81d3-4ba0-bf79-9388dd5b1a1c 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:49:35 compute-0 nova_compute[187223]: 2025-11-28 17:49:35.894 187227 DEBUG oslo_concurrency.lockutils [req-a48ceb7d-e2a3-485f-b10e-d07fd6d4e807 req-dce7f145-81d3-4ba0-bf79-9388dd5b1a1c 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:49:35 compute-0 nova_compute[187223]: 2025-11-28 17:49:35.894 187227 DEBUG nova.compute.manager [req-a48ceb7d-e2a3-485f-b10e-d07fd6d4e807 req-dce7f145-81d3-4ba0-bf79-9388dd5b1a1c 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] No waiting events found dispatching network-vif-unplugged-9a913108-e2f3-42e3-92f5-51da2a1f5354 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:49:35 compute-0 nova_compute[187223]: 2025-11-28 17:49:35.894 187227 DEBUG nova.compute.manager [req-a48ceb7d-e2a3-485f-b10e-d07fd6d4e807 req-dce7f145-81d3-4ba0-bf79-9388dd5b1a1c 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Received event network-vif-unplugged-9a913108-e2f3-42e3-92f5-51da2a1f5354 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 28 17:49:35 compute-0 nova_compute[187223]: 2025-11-28 17:49:35.895 187227 DEBUG nova.compute.manager [req-a48ceb7d-e2a3-485f-b10e-d07fd6d4e807 req-dce7f145-81d3-4ba0-bf79-9388dd5b1a1c 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Received event network-vif-plugged-9a913108-e2f3-42e3-92f5-51da2a1f5354 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:49:35 compute-0 nova_compute[187223]: 2025-11-28 17:49:35.895 187227 DEBUG oslo_concurrency.lockutils [req-a48ceb7d-e2a3-485f-b10e-d07fd6d4e807 req-dce7f145-81d3-4ba0-bf79-9388dd5b1a1c 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:49:35 compute-0 nova_compute[187223]: 2025-11-28 17:49:35.895 187227 DEBUG oslo_concurrency.lockutils [req-a48ceb7d-e2a3-485f-b10e-d07fd6d4e807 req-dce7f145-81d3-4ba0-bf79-9388dd5b1a1c 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:49:35 compute-0 nova_compute[187223]: 2025-11-28 17:49:35.896 187227 DEBUG oslo_concurrency.lockutils [req-a48ceb7d-e2a3-485f-b10e-d07fd6d4e807 req-dce7f145-81d3-4ba0-bf79-9388dd5b1a1c 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:49:35 compute-0 nova_compute[187223]: 2025-11-28 17:49:35.896 187227 DEBUG nova.compute.manager [req-a48ceb7d-e2a3-485f-b10e-d07fd6d4e807 req-dce7f145-81d3-4ba0-bf79-9388dd5b1a1c 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] No waiting events found dispatching network-vif-plugged-9a913108-e2f3-42e3-92f5-51da2a1f5354 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:49:35 compute-0 nova_compute[187223]: 2025-11-28 17:49:35.896 187227 WARNING nova.compute.manager [req-a48ceb7d-e2a3-485f-b10e-d07fd6d4e807 req-dce7f145-81d3-4ba0-bf79-9388dd5b1a1c 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Received unexpected event network-vif-plugged-9a913108-e2f3-42e3-92f5-51da2a1f5354 for instance with vm_state active and task_state migrating.
Nov 28 17:49:35 compute-0 nova_compute[187223]: 2025-11-28 17:49:35.897 187227 DEBUG nova.compute.manager [req-a48ceb7d-e2a3-485f-b10e-d07fd6d4e807 req-dce7f145-81d3-4ba0-bf79-9388dd5b1a1c 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Received event network-vif-plugged-9a913108-e2f3-42e3-92f5-51da2a1f5354 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:49:35 compute-0 nova_compute[187223]: 2025-11-28 17:49:35.897 187227 DEBUG oslo_concurrency.lockutils [req-a48ceb7d-e2a3-485f-b10e-d07fd6d4e807 req-dce7f145-81d3-4ba0-bf79-9388dd5b1a1c 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:49:35 compute-0 nova_compute[187223]: 2025-11-28 17:49:35.897 187227 DEBUG oslo_concurrency.lockutils [req-a48ceb7d-e2a3-485f-b10e-d07fd6d4e807 req-dce7f145-81d3-4ba0-bf79-9388dd5b1a1c 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:49:35 compute-0 nova_compute[187223]: 2025-11-28 17:49:35.897 187227 DEBUG oslo_concurrency.lockutils [req-a48ceb7d-e2a3-485f-b10e-d07fd6d4e807 req-dce7f145-81d3-4ba0-bf79-9388dd5b1a1c 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:49:35 compute-0 nova_compute[187223]: 2025-11-28 17:49:35.898 187227 DEBUG nova.compute.manager [req-a48ceb7d-e2a3-485f-b10e-d07fd6d4e807 req-dce7f145-81d3-4ba0-bf79-9388dd5b1a1c 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] No waiting events found dispatching network-vif-plugged-9a913108-e2f3-42e3-92f5-51da2a1f5354 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:49:35 compute-0 nova_compute[187223]: 2025-11-28 17:49:35.898 187227 WARNING nova.compute.manager [req-a48ceb7d-e2a3-485f-b10e-d07fd6d4e807 req-dce7f145-81d3-4ba0-bf79-9388dd5b1a1c 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Received unexpected event network-vif-plugged-9a913108-e2f3-42e3-92f5-51da2a1f5354 for instance with vm_state active and task_state migrating.
Nov 28 17:49:36 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:49:36.799 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2ad2dbac-a967-40fb-b69b-7c374c5f8e9d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:49:37 compute-0 nova_compute[187223]: 2025-11-28 17:49:37.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:49:38 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Nov 28 17:49:38 compute-0 systemd[215374]: Activating special unit Exit the Session...
Nov 28 17:49:38 compute-0 systemd[215374]: Stopped target Main User Target.
Nov 28 17:49:38 compute-0 systemd[215374]: Stopped target Basic System.
Nov 28 17:49:38 compute-0 systemd[215374]: Stopped target Paths.
Nov 28 17:49:38 compute-0 systemd[215374]: Stopped target Sockets.
Nov 28 17:49:38 compute-0 systemd[215374]: Stopped target Timers.
Nov 28 17:49:38 compute-0 systemd[215374]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 28 17:49:38 compute-0 systemd[215374]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 28 17:49:38 compute-0 systemd[215374]: Closed D-Bus User Message Bus Socket.
Nov 28 17:49:38 compute-0 systemd[215374]: Stopped Create User's Volatile Files and Directories.
Nov 28 17:49:38 compute-0 systemd[215374]: Removed slice User Application Slice.
Nov 28 17:49:38 compute-0 systemd[215374]: Reached target Shutdown.
Nov 28 17:49:38 compute-0 systemd[215374]: Finished Exit the Session.
Nov 28 17:49:38 compute-0 systemd[215374]: Reached target Exit the Session.
Nov 28 17:49:38 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Nov 28 17:49:38 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Nov 28 17:49:38 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 28 17:49:38 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 28 17:49:38 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 28 17:49:38 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 28 17:49:38 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Nov 28 17:49:39 compute-0 nova_compute[187223]: 2025-11-28 17:49:39.331 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:49:39 compute-0 nova_compute[187223]: 2025-11-28 17:49:39.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:49:39 compute-0 nova_compute[187223]: 2025-11-28 17:49:39.684 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 17:49:40 compute-0 nova_compute[187223]: 2025-11-28 17:49:40.185 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:49:40 compute-0 nova_compute[187223]: 2025-11-28 17:49:40.685 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:49:40 compute-0 nova_compute[187223]: 2025-11-28 17:49:40.685 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:49:40 compute-0 nova_compute[187223]: 2025-11-28 17:49:40.960 187227 DEBUG oslo_concurrency.lockutils [None req-0eca0c7f-3e44-45db-92cf-40c2b15f44da a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:49:40 compute-0 nova_compute[187223]: 2025-11-28 17:49:40.961 187227 DEBUG oslo_concurrency.lockutils [None req-0eca0c7f-3e44-45db-92cf-40c2b15f44da a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:49:40 compute-0 nova_compute[187223]: 2025-11-28 17:49:40.961 187227 DEBUG oslo_concurrency.lockutils [None req-0eca0c7f-3e44-45db-92cf-40c2b15f44da a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:49:40 compute-0 nova_compute[187223]: 2025-11-28 17:49:40.989 187227 DEBUG oslo_concurrency.lockutils [None req-0eca0c7f-3e44-45db-92cf-40c2b15f44da a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:49:40 compute-0 nova_compute[187223]: 2025-11-28 17:49:40.990 187227 DEBUG oslo_concurrency.lockutils [None req-0eca0c7f-3e44-45db-92cf-40c2b15f44da a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:49:40 compute-0 nova_compute[187223]: 2025-11-28 17:49:40.991 187227 DEBUG oslo_concurrency.lockutils [None req-0eca0c7f-3e44-45db-92cf-40c2b15f44da a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:49:40 compute-0 nova_compute[187223]: 2025-11-28 17:49:40.991 187227 DEBUG nova.compute.resource_tracker [None req-0eca0c7f-3e44-45db-92cf-40c2b15f44da a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 17:49:41 compute-0 nova_compute[187223]: 2025-11-28 17:49:41.188 187227 WARNING nova.virt.libvirt.driver [None req-0eca0c7f-3e44-45db-92cf-40c2b15f44da a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 17:49:41 compute-0 nova_compute[187223]: 2025-11-28 17:49:41.189 187227 DEBUG nova.compute.resource_tracker [None req-0eca0c7f-3e44-45db-92cf-40c2b15f44da a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5862MB free_disk=73.3404655456543GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 17:49:41 compute-0 nova_compute[187223]: 2025-11-28 17:49:41.189 187227 DEBUG oslo_concurrency.lockutils [None req-0eca0c7f-3e44-45db-92cf-40c2b15f44da a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:49:41 compute-0 nova_compute[187223]: 2025-11-28 17:49:41.190 187227 DEBUG oslo_concurrency.lockutils [None req-0eca0c7f-3e44-45db-92cf-40c2b15f44da a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:49:41 compute-0 nova_compute[187223]: 2025-11-28 17:49:41.234 187227 DEBUG nova.compute.resource_tracker [None req-0eca0c7f-3e44-45db-92cf-40c2b15f44da a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Migration for instance f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Nov 28 17:49:41 compute-0 nova_compute[187223]: 2025-11-28 17:49:41.262 187227 DEBUG nova.compute.resource_tracker [None req-0eca0c7f-3e44-45db-92cf-40c2b15f44da a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Nov 28 17:49:41 compute-0 nova_compute[187223]: 2025-11-28 17:49:41.309 187227 DEBUG nova.compute.resource_tracker [None req-0eca0c7f-3e44-45db-92cf-40c2b15f44da a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Migration bd50638f-86cb-4105-8671-9e3b560c5829 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Nov 28 17:49:41 compute-0 nova_compute[187223]: 2025-11-28 17:49:41.310 187227 DEBUG nova.compute.resource_tracker [None req-0eca0c7f-3e44-45db-92cf-40c2b15f44da a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 17:49:41 compute-0 nova_compute[187223]: 2025-11-28 17:49:41.310 187227 DEBUG nova.compute.resource_tracker [None req-0eca0c7f-3e44-45db-92cf-40c2b15f44da a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 17:49:41 compute-0 nova_compute[187223]: 2025-11-28 17:49:41.362 187227 DEBUG nova.compute.provider_tree [None req-0eca0c7f-3e44-45db-92cf-40c2b15f44da a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 17:49:41 compute-0 nova_compute[187223]: 2025-11-28 17:49:41.377 187227 DEBUG nova.scheduler.client.report [None req-0eca0c7f-3e44-45db-92cf-40c2b15f44da a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 17:49:41 compute-0 nova_compute[187223]: 2025-11-28 17:49:41.402 187227 DEBUG nova.compute.resource_tracker [None req-0eca0c7f-3e44-45db-92cf-40c2b15f44da a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 17:49:41 compute-0 nova_compute[187223]: 2025-11-28 17:49:41.403 187227 DEBUG oslo_concurrency.lockutils [None req-0eca0c7f-3e44-45db-92cf-40c2b15f44da a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.213s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:49:41 compute-0 nova_compute[187223]: 2025-11-28 17:49:41.408 187227 INFO nova.compute.manager [None req-0eca0c7f-3e44-45db-92cf-40c2b15f44da a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Nov 28 17:49:41 compute-0 nova_compute[187223]: 2025-11-28 17:49:41.534 187227 INFO nova.scheduler.client.report [None req-0eca0c7f-3e44-45db-92cf-40c2b15f44da a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Deleted allocation for migration bd50638f-86cb-4105-8671-9e3b560c5829
Nov 28 17:49:41 compute-0 nova_compute[187223]: 2025-11-28 17:49:41.534 187227 DEBUG nova.virt.libvirt.driver [None req-0eca0c7f-3e44-45db-92cf-40c2b15f44da a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Nov 28 17:49:42 compute-0 podman[215516]: 2025-11-28 17:49:42.23146424 +0000 UTC m=+0.092297340 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 17:49:42 compute-0 nova_compute[187223]: 2025-11-28 17:49:42.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:49:42 compute-0 nova_compute[187223]: 2025-11-28 17:49:42.716 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:49:42 compute-0 nova_compute[187223]: 2025-11-28 17:49:42.717 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:49:42 compute-0 nova_compute[187223]: 2025-11-28 17:49:42.717 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:49:42 compute-0 nova_compute[187223]: 2025-11-28 17:49:42.717 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 17:49:42 compute-0 nova_compute[187223]: 2025-11-28 17:49:42.950 187227 WARNING nova.virt.libvirt.driver [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 17:49:42 compute-0 nova_compute[187223]: 2025-11-28 17:49:42.951 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5866MB free_disk=73.3404655456543GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 17:49:42 compute-0 nova_compute[187223]: 2025-11-28 17:49:42.951 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:49:42 compute-0 nova_compute[187223]: 2025-11-28 17:49:42.952 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:49:43 compute-0 nova_compute[187223]: 2025-11-28 17:49:43.012 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 17:49:43 compute-0 nova_compute[187223]: 2025-11-28 17:49:43.012 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 17:49:43 compute-0 nova_compute[187223]: 2025-11-28 17:49:43.047 187227 DEBUG nova.compute.provider_tree [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 17:49:43 compute-0 nova_compute[187223]: 2025-11-28 17:49:43.065 187227 DEBUG nova.scheduler.client.report [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 17:49:43 compute-0 nova_compute[187223]: 2025-11-28 17:49:43.066 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 17:49:43 compute-0 nova_compute[187223]: 2025-11-28 17:49:43.066 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.115s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:49:44 compute-0 nova_compute[187223]: 2025-11-28 17:49:44.332 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:49:45 compute-0 nova_compute[187223]: 2025-11-28 17:49:45.188 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:49:46 compute-0 nova_compute[187223]: 2025-11-28 17:49:46.067 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:49:46 compute-0 nova_compute[187223]: 2025-11-28 17:49:46.068 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 17:49:46 compute-0 nova_compute[187223]: 2025-11-28 17:49:46.068 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 17:49:46 compute-0 nova_compute[187223]: 2025-11-28 17:49:46.106 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 17:49:46 compute-0 nova_compute[187223]: 2025-11-28 17:49:46.717 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:49:47 compute-0 nova_compute[187223]: 2025-11-28 17:49:47.949 187227 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764352172.9474142, f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:49:47 compute-0 nova_compute[187223]: 2025-11-28 17:49:47.949 187227 INFO nova.compute.manager [-] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] VM Stopped (Lifecycle Event)
Nov 28 17:49:47 compute-0 nova_compute[187223]: 2025-11-28 17:49:47.980 187227 DEBUG nova.compute.manager [None req-76d23b5f-d012-45c5-9785-e260bba4019b - - - - - -] [instance: f6f71e02-b8fb-4c18-9a1b-aa0d8c90776a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:49:48 compute-0 nova_compute[187223]: 2025-11-28 17:49:48.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:49:49 compute-0 nova_compute[187223]: 2025-11-28 17:49:49.335 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:49:50 compute-0 nova_compute[187223]: 2025-11-28 17:49:50.191 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:49:51 compute-0 podman[215542]: 2025-11-28 17:49:51.188733093 +0000 UTC m=+0.055140409 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 17:49:51 compute-0 nova_compute[187223]: 2025-11-28 17:49:51.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:49:54 compute-0 nova_compute[187223]: 2025-11-28 17:49:54.337 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:49:55 compute-0 nova_compute[187223]: 2025-11-28 17:49:55.193 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:49:57 compute-0 podman[215561]: 2025-11-28 17:49:57.202886141 +0000 UTC m=+0.070237478 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 28 17:49:57 compute-0 podman[215562]: 2025-11-28 17:49:57.218188847 +0000 UTC m=+0.079745204 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251125)
Nov 28 17:49:59 compute-0 nova_compute[187223]: 2025-11-28 17:49:59.339 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:49:59 compute-0 podman[197556]: time="2025-11-28T17:49:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:49:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:49:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 28 17:49:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:49:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2593 "" "Go-http-client/1.1"
Nov 28 17:50:00 compute-0 nova_compute[187223]: 2025-11-28 17:50:00.195 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:50:00 compute-0 podman[215604]: 2025-11-28 17:50:00.22583384 +0000 UTC m=+0.085352470 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, release=1755695350, version=9.6, architecture=x86_64, vendor=Red Hat, Inc., distribution-scope=public, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 28 17:50:01 compute-0 openstack_network_exporter[199717]: ERROR   17:50:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:50:01 compute-0 openstack_network_exporter[199717]: ERROR   17:50:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:50:01 compute-0 openstack_network_exporter[199717]: ERROR   17:50:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:50:01 compute-0 openstack_network_exporter[199717]: ERROR   17:50:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:50:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:50:01 compute-0 openstack_network_exporter[199717]: ERROR   17:50:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:50:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:50:04 compute-0 nova_compute[187223]: 2025-11-28 17:50:04.343 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:50:05 compute-0 nova_compute[187223]: 2025-11-28 17:50:05.197 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:50:09 compute-0 sshd-session[215625]: Invalid user solana from 193.32.162.146 port 43216
Nov 28 17:50:09 compute-0 sshd-session[215625]: Connection closed by invalid user solana 193.32.162.146 port 43216 [preauth]
Nov 28 17:50:09 compute-0 nova_compute[187223]: 2025-11-28 17:50:09.344 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:50:10 compute-0 nova_compute[187223]: 2025-11-28 17:50:10.199 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:50:11 compute-0 nova_compute[187223]: 2025-11-28 17:50:11.267 187227 DEBUG oslo_concurrency.lockutils [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Acquiring lock "4fb5c36e-4336-4f9b-a18a-182d79fc7fb1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:50:11 compute-0 nova_compute[187223]: 2025-11-28 17:50:11.267 187227 DEBUG oslo_concurrency.lockutils [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "4fb5c36e-4336-4f9b-a18a-182d79fc7fb1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:50:11 compute-0 nova_compute[187223]: 2025-11-28 17:50:11.291 187227 DEBUG nova.compute.manager [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 28 17:50:11 compute-0 nova_compute[187223]: 2025-11-28 17:50:11.379 187227 DEBUG oslo_concurrency.lockutils [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:50:11 compute-0 nova_compute[187223]: 2025-11-28 17:50:11.380 187227 DEBUG oslo_concurrency.lockutils [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:50:11 compute-0 nova_compute[187223]: 2025-11-28 17:50:11.388 187227 DEBUG nova.virt.hardware [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 28 17:50:11 compute-0 nova_compute[187223]: 2025-11-28 17:50:11.388 187227 INFO nova.compute.claims [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Claim successful on node compute-0.ctlplane.example.com
Nov 28 17:50:11 compute-0 nova_compute[187223]: 2025-11-28 17:50:11.528 187227 DEBUG nova.compute.provider_tree [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 17:50:11 compute-0 nova_compute[187223]: 2025-11-28 17:50:11.542 187227 DEBUG nova.scheduler.client.report [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 17:50:11 compute-0 nova_compute[187223]: 2025-11-28 17:50:11.564 187227 DEBUG oslo_concurrency.lockutils [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.184s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:50:11 compute-0 nova_compute[187223]: 2025-11-28 17:50:11.565 187227 DEBUG nova.compute.manager [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 28 17:50:11 compute-0 nova_compute[187223]: 2025-11-28 17:50:11.623 187227 DEBUG nova.compute.manager [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 28 17:50:11 compute-0 nova_compute[187223]: 2025-11-28 17:50:11.624 187227 DEBUG nova.network.neutron [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 28 17:50:11 compute-0 nova_compute[187223]: 2025-11-28 17:50:11.659 187227 INFO nova.virt.libvirt.driver [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 28 17:50:11 compute-0 nova_compute[187223]: 2025-11-28 17:50:11.684 187227 DEBUG nova.compute.manager [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 28 17:50:11 compute-0 nova_compute[187223]: 2025-11-28 17:50:11.835 187227 DEBUG nova.compute.manager [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 28 17:50:11 compute-0 nova_compute[187223]: 2025-11-28 17:50:11.837 187227 DEBUG nova.virt.libvirt.driver [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 28 17:50:11 compute-0 nova_compute[187223]: 2025-11-28 17:50:11.838 187227 INFO nova.virt.libvirt.driver [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Creating image(s)
Nov 28 17:50:11 compute-0 nova_compute[187223]: 2025-11-28 17:50:11.840 187227 DEBUG oslo_concurrency.lockutils [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Acquiring lock "/var/lib/nova/instances/4fb5c36e-4336-4f9b-a18a-182d79fc7fb1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:50:11 compute-0 nova_compute[187223]: 2025-11-28 17:50:11.841 187227 DEBUG oslo_concurrency.lockutils [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "/var/lib/nova/instances/4fb5c36e-4336-4f9b-a18a-182d79fc7fb1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:50:11 compute-0 nova_compute[187223]: 2025-11-28 17:50:11.842 187227 DEBUG oslo_concurrency.lockutils [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "/var/lib/nova/instances/4fb5c36e-4336-4f9b-a18a-182d79fc7fb1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:50:11 compute-0 nova_compute[187223]: 2025-11-28 17:50:11.872 187227 DEBUG oslo_concurrency.processutils [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:50:11 compute-0 nova_compute[187223]: 2025-11-28 17:50:11.936 187227 DEBUG oslo_concurrency.processutils [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:50:11 compute-0 nova_compute[187223]: 2025-11-28 17:50:11.937 187227 DEBUG oslo_concurrency.lockutils [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Acquiring lock "bd6565e0cc32420aac21dd3de8842bc53bb13eba" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:50:11 compute-0 nova_compute[187223]: 2025-11-28 17:50:11.938 187227 DEBUG oslo_concurrency.lockutils [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "bd6565e0cc32420aac21dd3de8842bc53bb13eba" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:50:11 compute-0 nova_compute[187223]: 2025-11-28 17:50:11.949 187227 DEBUG oslo_concurrency.processutils [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:50:12 compute-0 nova_compute[187223]: 2025-11-28 17:50:12.012 187227 DEBUG oslo_concurrency.processutils [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:50:12 compute-0 nova_compute[187223]: 2025-11-28 17:50:12.013 187227 DEBUG oslo_concurrency.processutils [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba,backing_fmt=raw /var/lib/nova/instances/4fb5c36e-4336-4f9b-a18a-182d79fc7fb1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:50:12 compute-0 nova_compute[187223]: 2025-11-28 17:50:12.042 187227 DEBUG oslo_concurrency.processutils [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba,backing_fmt=raw /var/lib/nova/instances/4fb5c36e-4336-4f9b-a18a-182d79fc7fb1/disk 1073741824" returned: 0 in 0.028s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:50:12 compute-0 nova_compute[187223]: 2025-11-28 17:50:12.043 187227 DEBUG oslo_concurrency.lockutils [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "bd6565e0cc32420aac21dd3de8842bc53bb13eba" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:50:12 compute-0 nova_compute[187223]: 2025-11-28 17:50:12.043 187227 DEBUG oslo_concurrency.processutils [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:50:12 compute-0 nova_compute[187223]: 2025-11-28 17:50:12.098 187227 DEBUG oslo_concurrency.processutils [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:50:12 compute-0 nova_compute[187223]: 2025-11-28 17:50:12.100 187227 DEBUG nova.virt.disk.api [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Checking if we can resize image /var/lib/nova/instances/4fb5c36e-4336-4f9b-a18a-182d79fc7fb1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 28 17:50:12 compute-0 nova_compute[187223]: 2025-11-28 17:50:12.100 187227 DEBUG oslo_concurrency.processutils [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4fb5c36e-4336-4f9b-a18a-182d79fc7fb1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:50:12 compute-0 nova_compute[187223]: 2025-11-28 17:50:12.160 187227 DEBUG oslo_concurrency.processutils [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4fb5c36e-4336-4f9b-a18a-182d79fc7fb1/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:50:12 compute-0 nova_compute[187223]: 2025-11-28 17:50:12.161 187227 DEBUG nova.virt.disk.api [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Cannot resize image /var/lib/nova/instances/4fb5c36e-4336-4f9b-a18a-182d79fc7fb1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 28 17:50:12 compute-0 nova_compute[187223]: 2025-11-28 17:50:12.161 187227 DEBUG nova.objects.instance [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lazy-loading 'migration_context' on Instance uuid 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:50:12 compute-0 nova_compute[187223]: 2025-11-28 17:50:12.185 187227 DEBUG nova.virt.libvirt.driver [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 28 17:50:12 compute-0 nova_compute[187223]: 2025-11-28 17:50:12.185 187227 DEBUG nova.virt.libvirt.driver [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Ensure instance console log exists: /var/lib/nova/instances/4fb5c36e-4336-4f9b-a18a-182d79fc7fb1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 28 17:50:12 compute-0 nova_compute[187223]: 2025-11-28 17:50:12.186 187227 DEBUG oslo_concurrency.lockutils [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:50:12 compute-0 nova_compute[187223]: 2025-11-28 17:50:12.186 187227 DEBUG oslo_concurrency.lockutils [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:50:12 compute-0 nova_compute[187223]: 2025-11-28 17:50:12.186 187227 DEBUG oslo_concurrency.lockutils [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:50:12 compute-0 nova_compute[187223]: 2025-11-28 17:50:12.790 187227 DEBUG nova.policy [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '40bca16232f3471c8094a414f8874e9a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f987f40adf1f46018ab0ca81b8d954f6', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 28 17:50:13 compute-0 podman[215642]: 2025-11-28 17:50:13.223969221 +0000 UTC m=+0.086597482 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 17:50:14 compute-0 nova_compute[187223]: 2025-11-28 17:50:14.345 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:50:15 compute-0 nova_compute[187223]: 2025-11-28 17:50:15.005 187227 DEBUG nova.network.neutron [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Successfully created port: 04f9669d-47d8-4dc0-ab98-04885312a101 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 28 17:50:15 compute-0 nova_compute[187223]: 2025-11-28 17:50:15.201 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:50:17 compute-0 nova_compute[187223]: 2025-11-28 17:50:17.862 187227 DEBUG nova.network.neutron [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Successfully updated port: 04f9669d-47d8-4dc0-ab98-04885312a101 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 28 17:50:17 compute-0 nova_compute[187223]: 2025-11-28 17:50:17.895 187227 DEBUG oslo_concurrency.lockutils [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Acquiring lock "refresh_cache-4fb5c36e-4336-4f9b-a18a-182d79fc7fb1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 17:50:17 compute-0 nova_compute[187223]: 2025-11-28 17:50:17.895 187227 DEBUG oslo_concurrency.lockutils [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Acquired lock "refresh_cache-4fb5c36e-4336-4f9b-a18a-182d79fc7fb1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 17:50:17 compute-0 nova_compute[187223]: 2025-11-28 17:50:17.896 187227 DEBUG nova.network.neutron [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 28 17:50:18 compute-0 nova_compute[187223]: 2025-11-28 17:50:18.226 187227 DEBUG nova.compute.manager [req-f9af2156-ea39-4a90-85d5-3a98a0de4050 req-7f6a481a-326c-4681-ae23-0e4898a22e22 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Received event network-changed-04f9669d-47d8-4dc0-ab98-04885312a101 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:50:18 compute-0 nova_compute[187223]: 2025-11-28 17:50:18.227 187227 DEBUG nova.compute.manager [req-f9af2156-ea39-4a90-85d5-3a98a0de4050 req-7f6a481a-326c-4681-ae23-0e4898a22e22 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Refreshing instance network info cache due to event network-changed-04f9669d-47d8-4dc0-ab98-04885312a101. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 28 17:50:18 compute-0 nova_compute[187223]: 2025-11-28 17:50:18.228 187227 DEBUG oslo_concurrency.lockutils [req-f9af2156-ea39-4a90-85d5-3a98a0de4050 req-7f6a481a-326c-4681-ae23-0e4898a22e22 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "refresh_cache-4fb5c36e-4336-4f9b-a18a-182d79fc7fb1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 17:50:18 compute-0 nova_compute[187223]: 2025-11-28 17:50:18.766 187227 DEBUG nova.network.neutron [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 28 17:50:19 compute-0 nova_compute[187223]: 2025-11-28 17:50:19.347 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:50:20 compute-0 nova_compute[187223]: 2025-11-28 17:50:20.205 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:50:20 compute-0 nova_compute[187223]: 2025-11-28 17:50:20.303 187227 DEBUG nova.network.neutron [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Updating instance_info_cache with network_info: [{"id": "04f9669d-47d8-4dc0-ab98-04885312a101", "address": "fa:16:3e:b1:be:39", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04f9669d-47", "ovs_interfaceid": "04f9669d-47d8-4dc0-ab98-04885312a101", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:50:20 compute-0 nova_compute[187223]: 2025-11-28 17:50:20.329 187227 DEBUG oslo_concurrency.lockutils [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Releasing lock "refresh_cache-4fb5c36e-4336-4f9b-a18a-182d79fc7fb1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 17:50:20 compute-0 nova_compute[187223]: 2025-11-28 17:50:20.329 187227 DEBUG nova.compute.manager [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Instance network_info: |[{"id": "04f9669d-47d8-4dc0-ab98-04885312a101", "address": "fa:16:3e:b1:be:39", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04f9669d-47", "ovs_interfaceid": "04f9669d-47d8-4dc0-ab98-04885312a101", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 28 17:50:20 compute-0 nova_compute[187223]: 2025-11-28 17:50:20.330 187227 DEBUG oslo_concurrency.lockutils [req-f9af2156-ea39-4a90-85d5-3a98a0de4050 req-7f6a481a-326c-4681-ae23-0e4898a22e22 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquired lock "refresh_cache-4fb5c36e-4336-4f9b-a18a-182d79fc7fb1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 17:50:20 compute-0 nova_compute[187223]: 2025-11-28 17:50:20.330 187227 DEBUG nova.network.neutron [req-f9af2156-ea39-4a90-85d5-3a98a0de4050 req-7f6a481a-326c-4681-ae23-0e4898a22e22 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Refreshing network info cache for port 04f9669d-47d8-4dc0-ab98-04885312a101 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 28 17:50:20 compute-0 nova_compute[187223]: 2025-11-28 17:50:20.334 187227 DEBUG nova.virt.libvirt.driver [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Start _get_guest_xml network_info=[{"id": "04f9669d-47d8-4dc0-ab98-04885312a101", "address": "fa:16:3e:b1:be:39", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04f9669d-47", "ovs_interfaceid": "04f9669d-47d8-4dc0-ab98-04885312a101", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T17:27:07Z,direct_url=<?>,disk_format='qcow2',id=e66bcfff-a835-4b6a-9892-490d158c356a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ca3b936304c947f98c618f4bf25dee0e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-28T17:27:08Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_format': None, 'device_name': '/dev/vda', 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e66bcfff-a835-4b6a-9892-490d158c356a'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 28 17:50:20 compute-0 nova_compute[187223]: 2025-11-28 17:50:20.338 187227 WARNING nova.virt.libvirt.driver [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 17:50:20 compute-0 nova_compute[187223]: 2025-11-28 17:50:20.344 187227 DEBUG nova.virt.libvirt.host [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 28 17:50:20 compute-0 nova_compute[187223]: 2025-11-28 17:50:20.344 187227 DEBUG nova.virt.libvirt.host [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 28 17:50:20 compute-0 nova_compute[187223]: 2025-11-28 17:50:20.348 187227 DEBUG nova.virt.libvirt.host [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 28 17:50:20 compute-0 nova_compute[187223]: 2025-11-28 17:50:20.348 187227 DEBUG nova.virt.libvirt.host [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 28 17:50:20 compute-0 nova_compute[187223]: 2025-11-28 17:50:20.349 187227 DEBUG nova.virt.libvirt.driver [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 28 17:50:20 compute-0 nova_compute[187223]: 2025-11-28 17:50:20.349 187227 DEBUG nova.virt.hardware [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-28T17:27:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6f44bded-bdbe-4623-9c87-afc5919e8381',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T17:27:07Z,direct_url=<?>,disk_format='qcow2',id=e66bcfff-a835-4b6a-9892-490d158c356a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ca3b936304c947f98c618f4bf25dee0e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-28T17:27:08Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 28 17:50:20 compute-0 nova_compute[187223]: 2025-11-28 17:50:20.350 187227 DEBUG nova.virt.hardware [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 28 17:50:20 compute-0 nova_compute[187223]: 2025-11-28 17:50:20.350 187227 DEBUG nova.virt.hardware [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 28 17:50:20 compute-0 nova_compute[187223]: 2025-11-28 17:50:20.350 187227 DEBUG nova.virt.hardware [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 28 17:50:20 compute-0 nova_compute[187223]: 2025-11-28 17:50:20.350 187227 DEBUG nova.virt.hardware [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 28 17:50:20 compute-0 nova_compute[187223]: 2025-11-28 17:50:20.350 187227 DEBUG nova.virt.hardware [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 28 17:50:20 compute-0 nova_compute[187223]: 2025-11-28 17:50:20.351 187227 DEBUG nova.virt.hardware [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 28 17:50:20 compute-0 nova_compute[187223]: 2025-11-28 17:50:20.351 187227 DEBUG nova.virt.hardware [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 28 17:50:20 compute-0 nova_compute[187223]: 2025-11-28 17:50:20.351 187227 DEBUG nova.virt.hardware [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 28 17:50:20 compute-0 nova_compute[187223]: 2025-11-28 17:50:20.351 187227 DEBUG nova.virt.hardware [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 28 17:50:20 compute-0 nova_compute[187223]: 2025-11-28 17:50:20.351 187227 DEBUG nova.virt.hardware [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 28 17:50:20 compute-0 nova_compute[187223]: 2025-11-28 17:50:20.355 187227 DEBUG nova.virt.libvirt.vif [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T17:50:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1229670702',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1229670702',id=19,image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f987f40adf1f46018ab0ca81b8d954f6',ramdisk_id='',reservation_id='r-yamterc6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-384316604',owner_user_name='tempest-TestExecuteStrategies-384316604-project-member'},tags=TagList,task_s
tate='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T17:50:11Z,user_data=None,user_id='40bca16232f3471c8094a414f8874e9a',uuid=4fb5c36e-4336-4f9b-a18a-182d79fc7fb1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "04f9669d-47d8-4dc0-ab98-04885312a101", "address": "fa:16:3e:b1:be:39", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04f9669d-47", "ovs_interfaceid": "04f9669d-47d8-4dc0-ab98-04885312a101", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 28 17:50:20 compute-0 nova_compute[187223]: 2025-11-28 17:50:20.355 187227 DEBUG nova.network.os_vif_util [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Converting VIF {"id": "04f9669d-47d8-4dc0-ab98-04885312a101", "address": "fa:16:3e:b1:be:39", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04f9669d-47", "ovs_interfaceid": "04f9669d-47d8-4dc0-ab98-04885312a101", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 17:50:20 compute-0 nova_compute[187223]: 2025-11-28 17:50:20.356 187227 DEBUG nova.network.os_vif_util [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b1:be:39,bridge_name='br-int',has_traffic_filtering=True,id=04f9669d-47d8-4dc0-ab98-04885312a101,network=Network(7710a7d0-31b3-4473-89c4-40533fdd6e7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap04f9669d-47') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 17:50:20 compute-0 nova_compute[187223]: 2025-11-28 17:50:20.356 187227 DEBUG nova.objects.instance [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lazy-loading 'pci_devices' on Instance uuid 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:50:20 compute-0 nova_compute[187223]: 2025-11-28 17:50:20.373 187227 DEBUG nova.virt.libvirt.driver [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] End _get_guest_xml xml=<domain type="kvm">
Nov 28 17:50:20 compute-0 nova_compute[187223]:   <uuid>4fb5c36e-4336-4f9b-a18a-182d79fc7fb1</uuid>
Nov 28 17:50:20 compute-0 nova_compute[187223]:   <name>instance-00000013</name>
Nov 28 17:50:20 compute-0 nova_compute[187223]:   <memory>131072</memory>
Nov 28 17:50:20 compute-0 nova_compute[187223]:   <vcpu>1</vcpu>
Nov 28 17:50:20 compute-0 nova_compute[187223]:   <metadata>
Nov 28 17:50:20 compute-0 nova_compute[187223]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 28 17:50:20 compute-0 nova_compute[187223]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 28 17:50:20 compute-0 nova_compute[187223]:       <nova:name>tempest-TestExecuteStrategies-server-1229670702</nova:name>
Nov 28 17:50:20 compute-0 nova_compute[187223]:       <nova:creationTime>2025-11-28 17:50:20</nova:creationTime>
Nov 28 17:50:20 compute-0 nova_compute[187223]:       <nova:flavor name="m1.nano">
Nov 28 17:50:20 compute-0 nova_compute[187223]:         <nova:memory>128</nova:memory>
Nov 28 17:50:20 compute-0 nova_compute[187223]:         <nova:disk>1</nova:disk>
Nov 28 17:50:20 compute-0 nova_compute[187223]:         <nova:swap>0</nova:swap>
Nov 28 17:50:20 compute-0 nova_compute[187223]:         <nova:ephemeral>0</nova:ephemeral>
Nov 28 17:50:20 compute-0 nova_compute[187223]:         <nova:vcpus>1</nova:vcpus>
Nov 28 17:50:20 compute-0 nova_compute[187223]:       </nova:flavor>
Nov 28 17:50:20 compute-0 nova_compute[187223]:       <nova:owner>
Nov 28 17:50:20 compute-0 nova_compute[187223]:         <nova:user uuid="40bca16232f3471c8094a414f8874e9a">tempest-TestExecuteStrategies-384316604-project-member</nova:user>
Nov 28 17:50:20 compute-0 nova_compute[187223]:         <nova:project uuid="f987f40adf1f46018ab0ca81b8d954f6">tempest-TestExecuteStrategies-384316604</nova:project>
Nov 28 17:50:20 compute-0 nova_compute[187223]:       </nova:owner>
Nov 28 17:50:20 compute-0 nova_compute[187223]:       <nova:root type="image" uuid="e66bcfff-a835-4b6a-9892-490d158c356a"/>
Nov 28 17:50:20 compute-0 nova_compute[187223]:       <nova:ports>
Nov 28 17:50:20 compute-0 nova_compute[187223]:         <nova:port uuid="04f9669d-47d8-4dc0-ab98-04885312a101">
Nov 28 17:50:20 compute-0 nova_compute[187223]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 28 17:50:20 compute-0 nova_compute[187223]:         </nova:port>
Nov 28 17:50:20 compute-0 nova_compute[187223]:       </nova:ports>
Nov 28 17:50:20 compute-0 nova_compute[187223]:     </nova:instance>
Nov 28 17:50:20 compute-0 nova_compute[187223]:   </metadata>
Nov 28 17:50:20 compute-0 nova_compute[187223]:   <sysinfo type="smbios">
Nov 28 17:50:20 compute-0 nova_compute[187223]:     <system>
Nov 28 17:50:20 compute-0 nova_compute[187223]:       <entry name="manufacturer">RDO</entry>
Nov 28 17:50:20 compute-0 nova_compute[187223]:       <entry name="product">OpenStack Compute</entry>
Nov 28 17:50:20 compute-0 nova_compute[187223]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 28 17:50:20 compute-0 nova_compute[187223]:       <entry name="serial">4fb5c36e-4336-4f9b-a18a-182d79fc7fb1</entry>
Nov 28 17:50:20 compute-0 nova_compute[187223]:       <entry name="uuid">4fb5c36e-4336-4f9b-a18a-182d79fc7fb1</entry>
Nov 28 17:50:20 compute-0 nova_compute[187223]:       <entry name="family">Virtual Machine</entry>
Nov 28 17:50:20 compute-0 nova_compute[187223]:     </system>
Nov 28 17:50:20 compute-0 nova_compute[187223]:   </sysinfo>
Nov 28 17:50:20 compute-0 nova_compute[187223]:   <os>
Nov 28 17:50:20 compute-0 nova_compute[187223]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 28 17:50:20 compute-0 nova_compute[187223]:     <boot dev="hd"/>
Nov 28 17:50:20 compute-0 nova_compute[187223]:     <smbios mode="sysinfo"/>
Nov 28 17:50:20 compute-0 nova_compute[187223]:   </os>
Nov 28 17:50:20 compute-0 nova_compute[187223]:   <features>
Nov 28 17:50:20 compute-0 nova_compute[187223]:     <acpi/>
Nov 28 17:50:20 compute-0 nova_compute[187223]:     <apic/>
Nov 28 17:50:20 compute-0 nova_compute[187223]:     <vmcoreinfo/>
Nov 28 17:50:20 compute-0 nova_compute[187223]:   </features>
Nov 28 17:50:20 compute-0 nova_compute[187223]:   <clock offset="utc">
Nov 28 17:50:20 compute-0 nova_compute[187223]:     <timer name="pit" tickpolicy="delay"/>
Nov 28 17:50:20 compute-0 nova_compute[187223]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 28 17:50:20 compute-0 nova_compute[187223]:     <timer name="hpet" present="no"/>
Nov 28 17:50:20 compute-0 nova_compute[187223]:   </clock>
Nov 28 17:50:20 compute-0 nova_compute[187223]:   <cpu mode="custom" match="exact">
Nov 28 17:50:20 compute-0 nova_compute[187223]:     <model>Nehalem</model>
Nov 28 17:50:20 compute-0 nova_compute[187223]:     <topology sockets="1" cores="1" threads="1"/>
Nov 28 17:50:20 compute-0 nova_compute[187223]:   </cpu>
Nov 28 17:50:20 compute-0 nova_compute[187223]:   <devices>
Nov 28 17:50:20 compute-0 nova_compute[187223]:     <disk type="file" device="disk">
Nov 28 17:50:20 compute-0 nova_compute[187223]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 28 17:50:20 compute-0 nova_compute[187223]:       <source file="/var/lib/nova/instances/4fb5c36e-4336-4f9b-a18a-182d79fc7fb1/disk"/>
Nov 28 17:50:20 compute-0 nova_compute[187223]:       <target dev="vda" bus="virtio"/>
Nov 28 17:50:20 compute-0 nova_compute[187223]:     </disk>
Nov 28 17:50:20 compute-0 nova_compute[187223]:     <disk type="file" device="cdrom">
Nov 28 17:50:20 compute-0 nova_compute[187223]:       <driver name="qemu" type="raw" cache="none"/>
Nov 28 17:50:20 compute-0 nova_compute[187223]:       <source file="/var/lib/nova/instances/4fb5c36e-4336-4f9b-a18a-182d79fc7fb1/disk.config"/>
Nov 28 17:50:20 compute-0 nova_compute[187223]:       <target dev="sda" bus="sata"/>
Nov 28 17:50:20 compute-0 nova_compute[187223]:     </disk>
Nov 28 17:50:20 compute-0 nova_compute[187223]:     <interface type="ethernet">
Nov 28 17:50:20 compute-0 nova_compute[187223]:       <mac address="fa:16:3e:b1:be:39"/>
Nov 28 17:50:20 compute-0 nova_compute[187223]:       <model type="virtio"/>
Nov 28 17:50:20 compute-0 nova_compute[187223]:       <driver name="vhost" rx_queue_size="512"/>
Nov 28 17:50:20 compute-0 nova_compute[187223]:       <mtu size="1442"/>
Nov 28 17:50:20 compute-0 nova_compute[187223]:       <target dev="tap04f9669d-47"/>
Nov 28 17:50:20 compute-0 nova_compute[187223]:     </interface>
Nov 28 17:50:20 compute-0 nova_compute[187223]:     <serial type="pty">
Nov 28 17:50:20 compute-0 nova_compute[187223]:       <log file="/var/lib/nova/instances/4fb5c36e-4336-4f9b-a18a-182d79fc7fb1/console.log" append="off"/>
Nov 28 17:50:20 compute-0 nova_compute[187223]:     </serial>
Nov 28 17:50:20 compute-0 nova_compute[187223]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 28 17:50:20 compute-0 nova_compute[187223]:     <video>
Nov 28 17:50:20 compute-0 nova_compute[187223]:       <model type="virtio"/>
Nov 28 17:50:20 compute-0 nova_compute[187223]:     </video>
Nov 28 17:50:20 compute-0 nova_compute[187223]:     <input type="tablet" bus="usb"/>
Nov 28 17:50:20 compute-0 nova_compute[187223]:     <rng model="virtio">
Nov 28 17:50:20 compute-0 nova_compute[187223]:       <backend model="random">/dev/urandom</backend>
Nov 28 17:50:20 compute-0 nova_compute[187223]:     </rng>
Nov 28 17:50:20 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root"/>
Nov 28 17:50:20 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:50:20 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:50:20 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:50:20 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:50:20 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:50:20 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:50:20 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:50:20 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:50:20 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:50:20 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:50:20 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:50:20 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:50:20 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:50:20 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:50:20 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:50:20 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:50:20 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:50:20 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:50:20 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:50:20 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:50:20 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:50:20 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:50:20 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:50:20 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:50:20 compute-0 nova_compute[187223]:     <controller type="usb" index="0"/>
Nov 28 17:50:20 compute-0 nova_compute[187223]:     <memballoon model="virtio">
Nov 28 17:50:20 compute-0 nova_compute[187223]:       <stats period="10"/>
Nov 28 17:50:20 compute-0 nova_compute[187223]:     </memballoon>
Nov 28 17:50:20 compute-0 nova_compute[187223]:   </devices>
Nov 28 17:50:20 compute-0 nova_compute[187223]: </domain>
Nov 28 17:50:20 compute-0 nova_compute[187223]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 28 17:50:20 compute-0 nova_compute[187223]: 2025-11-28 17:50:20.374 187227 DEBUG nova.compute.manager [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Preparing to wait for external event network-vif-plugged-04f9669d-47d8-4dc0-ab98-04885312a101 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 28 17:50:20 compute-0 nova_compute[187223]: 2025-11-28 17:50:20.375 187227 DEBUG oslo_concurrency.lockutils [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Acquiring lock "4fb5c36e-4336-4f9b-a18a-182d79fc7fb1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:50:20 compute-0 nova_compute[187223]: 2025-11-28 17:50:20.375 187227 DEBUG oslo_concurrency.lockutils [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "4fb5c36e-4336-4f9b-a18a-182d79fc7fb1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:50:20 compute-0 nova_compute[187223]: 2025-11-28 17:50:20.375 187227 DEBUG oslo_concurrency.lockutils [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "4fb5c36e-4336-4f9b-a18a-182d79fc7fb1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:50:20 compute-0 nova_compute[187223]: 2025-11-28 17:50:20.376 187227 DEBUG nova.virt.libvirt.vif [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T17:50:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1229670702',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1229670702',id=19,image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f987f40adf1f46018ab0ca81b8d954f6',ramdisk_id='',reservation_id='r-yamterc6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-384316604',owner_user_name='tempest-TestExecuteStrategies-384316604-project-member'},tags=TagL
ist,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T17:50:11Z,user_data=None,user_id='40bca16232f3471c8094a414f8874e9a',uuid=4fb5c36e-4336-4f9b-a18a-182d79fc7fb1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "04f9669d-47d8-4dc0-ab98-04885312a101", "address": "fa:16:3e:b1:be:39", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04f9669d-47", "ovs_interfaceid": "04f9669d-47d8-4dc0-ab98-04885312a101", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 28 17:50:20 compute-0 nova_compute[187223]: 2025-11-28 17:50:20.376 187227 DEBUG nova.network.os_vif_util [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Converting VIF {"id": "04f9669d-47d8-4dc0-ab98-04885312a101", "address": "fa:16:3e:b1:be:39", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04f9669d-47", "ovs_interfaceid": "04f9669d-47d8-4dc0-ab98-04885312a101", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 17:50:20 compute-0 nova_compute[187223]: 2025-11-28 17:50:20.376 187227 DEBUG nova.network.os_vif_util [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b1:be:39,bridge_name='br-int',has_traffic_filtering=True,id=04f9669d-47d8-4dc0-ab98-04885312a101,network=Network(7710a7d0-31b3-4473-89c4-40533fdd6e7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap04f9669d-47') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 17:50:20 compute-0 nova_compute[187223]: 2025-11-28 17:50:20.377 187227 DEBUG os_vif [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b1:be:39,bridge_name='br-int',has_traffic_filtering=True,id=04f9669d-47d8-4dc0-ab98-04885312a101,network=Network(7710a7d0-31b3-4473-89c4-40533fdd6e7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap04f9669d-47') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 28 17:50:20 compute-0 nova_compute[187223]: 2025-11-28 17:50:20.377 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:50:20 compute-0 nova_compute[187223]: 2025-11-28 17:50:20.377 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:50:20 compute-0 nova_compute[187223]: 2025-11-28 17:50:20.378 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 17:50:20 compute-0 nova_compute[187223]: 2025-11-28 17:50:20.380 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:50:20 compute-0 nova_compute[187223]: 2025-11-28 17:50:20.381 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap04f9669d-47, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:50:20 compute-0 nova_compute[187223]: 2025-11-28 17:50:20.381 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap04f9669d-47, col_values=(('external_ids', {'iface-id': '04f9669d-47d8-4dc0-ab98-04885312a101', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b1:be:39', 'vm-uuid': '4fb5c36e-4336-4f9b-a18a-182d79fc7fb1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:50:20 compute-0 nova_compute[187223]: 2025-11-28 17:50:20.382 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:50:20 compute-0 NetworkManager[55763]: <info>  [1764352220.3842] manager: (tap04f9669d-47): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/63)
Nov 28 17:50:20 compute-0 nova_compute[187223]: 2025-11-28 17:50:20.384 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 17:50:20 compute-0 nova_compute[187223]: 2025-11-28 17:50:20.389 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:50:20 compute-0 nova_compute[187223]: 2025-11-28 17:50:20.390 187227 INFO os_vif [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b1:be:39,bridge_name='br-int',has_traffic_filtering=True,id=04f9669d-47d8-4dc0-ab98-04885312a101,network=Network(7710a7d0-31b3-4473-89c4-40533fdd6e7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap04f9669d-47')
Nov 28 17:50:20 compute-0 nova_compute[187223]: 2025-11-28 17:50:20.512 187227 DEBUG nova.virt.libvirt.driver [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 28 17:50:20 compute-0 nova_compute[187223]: 2025-11-28 17:50:20.512 187227 DEBUG nova.virt.libvirt.driver [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 28 17:50:20 compute-0 nova_compute[187223]: 2025-11-28 17:50:20.512 187227 DEBUG nova.virt.libvirt.driver [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] No VIF found with MAC fa:16:3e:b1:be:39, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 28 17:50:20 compute-0 nova_compute[187223]: 2025-11-28 17:50:20.513 187227 INFO nova.virt.libvirt.driver [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Using config drive
Nov 28 17:50:21 compute-0 nova_compute[187223]: 2025-11-28 17:50:21.167 187227 INFO nova.virt.libvirt.driver [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Creating config drive at /var/lib/nova/instances/4fb5c36e-4336-4f9b-a18a-182d79fc7fb1/disk.config
Nov 28 17:50:21 compute-0 nova_compute[187223]: 2025-11-28 17:50:21.173 187227 DEBUG oslo_concurrency.processutils [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4fb5c36e-4336-4f9b-a18a-182d79fc7fb1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2urz4sky execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:50:21 compute-0 nova_compute[187223]: 2025-11-28 17:50:21.313 187227 DEBUG oslo_concurrency.processutils [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4fb5c36e-4336-4f9b-a18a-182d79fc7fb1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2urz4sky" returned: 0 in 0.140s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:50:21 compute-0 kernel: tap04f9669d-47: entered promiscuous mode
Nov 28 17:50:21 compute-0 ovn_controller[95574]: 2025-11-28T17:50:21Z|00150|binding|INFO|Claiming lport 04f9669d-47d8-4dc0-ab98-04885312a101 for this chassis.
Nov 28 17:50:21 compute-0 ovn_controller[95574]: 2025-11-28T17:50:21Z|00151|binding|INFO|04f9669d-47d8-4dc0-ab98-04885312a101: Claiming fa:16:3e:b1:be:39 10.100.0.12
Nov 28 17:50:21 compute-0 NetworkManager[55763]: <info>  [1764352221.3843] manager: (tap04f9669d-47): new Tun device (/org/freedesktop/NetworkManager/Devices/64)
Nov 28 17:50:21 compute-0 nova_compute[187223]: 2025-11-28 17:50:21.386 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:50:21 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:50:21.391 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b1:be:39 10.100.0.12'], port_security=['fa:16:3e:b1:be:39 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '4fb5c36e-4336-4f9b-a18a-182d79fc7fb1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f987f40adf1f46018ab0ca81b8d954f6', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2454196c-3f33-40c4-b65e-ff5ce51cee25', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8b91a17c-2725-4cda-baa8-a6987e8e4bba, chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], logical_port=04f9669d-47d8-4dc0-ab98-04885312a101) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 17:50:21 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:50:21.392 104433 INFO neutron.agent.ovn.metadata.agent [-] Port 04f9669d-47d8-4dc0-ab98-04885312a101 in datapath 7710a7d0-31b3-4473-89c4-40533fdd6e7d bound to our chassis
Nov 28 17:50:21 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:50:21.393 104433 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7710a7d0-31b3-4473-89c4-40533fdd6e7d
Nov 28 17:50:21 compute-0 ovn_controller[95574]: 2025-11-28T17:50:21Z|00152|binding|INFO|Setting lport 04f9669d-47d8-4dc0-ab98-04885312a101 ovn-installed in OVS
Nov 28 17:50:21 compute-0 ovn_controller[95574]: 2025-11-28T17:50:21Z|00153|binding|INFO|Setting lport 04f9669d-47d8-4dc0-ab98-04885312a101 up in Southbound
Nov 28 17:50:21 compute-0 nova_compute[187223]: 2025-11-28 17:50:21.397 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:50:21 compute-0 nova_compute[187223]: 2025-11-28 17:50:21.400 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:50:21 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:50:21.405 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[65359617-0b61-4297-bc11-9b4f12e55145]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:50:21 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:50:21.406 104433 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7710a7d0-31 in ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 28 17:50:21 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:50:21.410 208826 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7710a7d0-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 28 17:50:21 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:50:21.410 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[c135f96e-d330-4f78-a0eb-f61c78cfa106]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:50:21 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:50:21.411 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[1557af97-158a-49f8-a464-0070ec752a45]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:50:21 compute-0 systemd-udevd[215696]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 17:50:21 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:50:21.422 104546 DEBUG oslo.privsep.daemon [-] privsep: reply[4d296847-41f1-400e-adb4-096bc508cfb9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:50:21 compute-0 systemd-machined[153517]: New machine qemu-14-instance-00000013.
Nov 28 17:50:21 compute-0 NetworkManager[55763]: <info>  [1764352221.4344] device (tap04f9669d-47): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 28 17:50:21 compute-0 NetworkManager[55763]: <info>  [1764352221.4352] device (tap04f9669d-47): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 28 17:50:21 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:50:21.438 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[260ad7bc-0d2d-4624-a553-defe258426c2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:50:21 compute-0 systemd[1]: Started Virtual Machine qemu-14-instance-00000013.
Nov 28 17:50:21 compute-0 podman[215679]: 2025-11-28 17:50:21.462616381 +0000 UTC m=+0.084214760 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125)
Nov 28 17:50:21 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:50:21.473 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[19946951-ed89-4e28-bc2b-c08d787137f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:50:21 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:50:21.478 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[b2207dbf-f6fe-4acd-9ebf-689056a1f349]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:50:21 compute-0 NetworkManager[55763]: <info>  [1764352221.4797] manager: (tap7710a7d0-30): new Veth device (/org/freedesktop/NetworkManager/Devices/65)
Nov 28 17:50:21 compute-0 systemd-udevd[215707]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 17:50:21 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:50:21.504 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[394efb02-1a9c-4dc6-964c-18484a2242eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:50:21 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:50:21.507 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[701f9104-600e-4428-8623-d15825aec14a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:50:21 compute-0 NetworkManager[55763]: <info>  [1764352221.5295] device (tap7710a7d0-30): carrier: link connected
Nov 28 17:50:21 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:50:21.535 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[e206adfa-be4b-4292-9502-505b77cdeb78]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:50:21 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:50:21.553 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[65604289-2400-4208-ad22-3a5961e07483]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7710a7d0-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7e:b9:9f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 45], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 539615, 'reachable_time': 30259, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215737, 'error': None, 'target': 'ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:50:21 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:50:21.570 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[d1dcd3db-5b55-45c6-a107-11a7824e243f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe7e:b99f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 539615, 'tstamp': 539615}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215738, 'error': None, 'target': 'ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:50:21 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:50:21.589 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[be1a9b95-d38d-4872-8bda-4fab98bc3bcf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7710a7d0-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7e:b9:9f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 45], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 539615, 'reachable_time': 30259, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215739, 'error': None, 'target': 'ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:50:21 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:50:21.627 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[b6c7037c-daee-4e22-aa21-284c72309c38]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:50:21 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:50:21.686 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[43282523-75c6-4a34-b504-f93cb1636d6b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:50:21 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:50:21.688 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7710a7d0-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:50:21 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:50:21.689 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 17:50:21 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:50:21.689 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7710a7d0-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:50:21 compute-0 nova_compute[187223]: 2025-11-28 17:50:21.690 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:50:21 compute-0 NetworkManager[55763]: <info>  [1764352221.6912] manager: (tap7710a7d0-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/66)
Nov 28 17:50:21 compute-0 kernel: tap7710a7d0-30: entered promiscuous mode
Nov 28 17:50:21 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:50:21.694 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7710a7d0-30, col_values=(('external_ids', {'iface-id': 'bc789832-2d4b-4b14-95c2-e30a740a3a6b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:50:21 compute-0 ovn_controller[95574]: 2025-11-28T17:50:21Z|00154|binding|INFO|Releasing lport bc789832-2d4b-4b14-95c2-e30a740a3a6b from this chassis (sb_readonly=0)
Nov 28 17:50:21 compute-0 nova_compute[187223]: 2025-11-28 17:50:21.695 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:50:21 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:50:21.697 104433 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7710a7d0-31b3-4473-89c4-40533fdd6e7d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7710a7d0-31b3-4473-89c4-40533fdd6e7d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 28 17:50:21 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:50:21.698 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[56592d3e-aeac-4efa-a3f4-d0a7346bb425]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:50:21 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:50:21.698 104433 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 28 17:50:21 compute-0 ovn_metadata_agent[104428]: global
Nov 28 17:50:21 compute-0 ovn_metadata_agent[104428]:     log         /dev/log local0 debug
Nov 28 17:50:21 compute-0 ovn_metadata_agent[104428]:     log-tag     haproxy-metadata-proxy-7710a7d0-31b3-4473-89c4-40533fdd6e7d
Nov 28 17:50:21 compute-0 ovn_metadata_agent[104428]:     user        root
Nov 28 17:50:21 compute-0 ovn_metadata_agent[104428]:     group       root
Nov 28 17:50:21 compute-0 ovn_metadata_agent[104428]:     maxconn     1024
Nov 28 17:50:21 compute-0 ovn_metadata_agent[104428]:     pidfile     /var/lib/neutron/external/pids/7710a7d0-31b3-4473-89c4-40533fdd6e7d.pid.haproxy
Nov 28 17:50:21 compute-0 ovn_metadata_agent[104428]:     daemon
Nov 28 17:50:21 compute-0 ovn_metadata_agent[104428]: 
Nov 28 17:50:21 compute-0 ovn_metadata_agent[104428]: defaults
Nov 28 17:50:21 compute-0 ovn_metadata_agent[104428]:     log global
Nov 28 17:50:21 compute-0 ovn_metadata_agent[104428]:     mode http
Nov 28 17:50:21 compute-0 ovn_metadata_agent[104428]:     option httplog
Nov 28 17:50:21 compute-0 ovn_metadata_agent[104428]:     option dontlognull
Nov 28 17:50:21 compute-0 ovn_metadata_agent[104428]:     option http-server-close
Nov 28 17:50:21 compute-0 ovn_metadata_agent[104428]:     option forwardfor
Nov 28 17:50:21 compute-0 ovn_metadata_agent[104428]:     retries                 3
Nov 28 17:50:21 compute-0 ovn_metadata_agent[104428]:     timeout http-request    30s
Nov 28 17:50:21 compute-0 ovn_metadata_agent[104428]:     timeout connect         30s
Nov 28 17:50:21 compute-0 ovn_metadata_agent[104428]:     timeout client          32s
Nov 28 17:50:21 compute-0 ovn_metadata_agent[104428]:     timeout server          32s
Nov 28 17:50:21 compute-0 ovn_metadata_agent[104428]:     timeout http-keep-alive 30s
Nov 28 17:50:21 compute-0 ovn_metadata_agent[104428]: 
Nov 28 17:50:21 compute-0 ovn_metadata_agent[104428]: 
Nov 28 17:50:21 compute-0 ovn_metadata_agent[104428]: listen listener
Nov 28 17:50:21 compute-0 ovn_metadata_agent[104428]:     bind 169.254.169.254:80
Nov 28 17:50:21 compute-0 ovn_metadata_agent[104428]:     server metadata /var/lib/neutron/metadata_proxy
Nov 28 17:50:21 compute-0 ovn_metadata_agent[104428]:     http-request add-header X-OVN-Network-ID 7710a7d0-31b3-4473-89c4-40533fdd6e7d
Nov 28 17:50:21 compute-0 ovn_metadata_agent[104428]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 28 17:50:21 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:50:21.699 104433 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'env', 'PROCESS_TAG=haproxy-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7710a7d0-31b3-4473-89c4-40533fdd6e7d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 28 17:50:21 compute-0 nova_compute[187223]: 2025-11-28 17:50:21.706 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:50:21 compute-0 nova_compute[187223]: 2025-11-28 17:50:21.734 187227 DEBUG nova.compute.manager [req-7514ac1e-1ff4-4efc-8e99-0a48ea6edded req-6a2fa4bc-e5fa-4463-96e5-39d192b5f3ce 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Received event network-vif-plugged-04f9669d-47d8-4dc0-ab98-04885312a101 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:50:21 compute-0 nova_compute[187223]: 2025-11-28 17:50:21.734 187227 DEBUG oslo_concurrency.lockutils [req-7514ac1e-1ff4-4efc-8e99-0a48ea6edded req-6a2fa4bc-e5fa-4463-96e5-39d192b5f3ce 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "4fb5c36e-4336-4f9b-a18a-182d79fc7fb1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:50:21 compute-0 nova_compute[187223]: 2025-11-28 17:50:21.734 187227 DEBUG oslo_concurrency.lockutils [req-7514ac1e-1ff4-4efc-8e99-0a48ea6edded req-6a2fa4bc-e5fa-4463-96e5-39d192b5f3ce 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "4fb5c36e-4336-4f9b-a18a-182d79fc7fb1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:50:21 compute-0 nova_compute[187223]: 2025-11-28 17:50:21.735 187227 DEBUG oslo_concurrency.lockutils [req-7514ac1e-1ff4-4efc-8e99-0a48ea6edded req-6a2fa4bc-e5fa-4463-96e5-39d192b5f3ce 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "4fb5c36e-4336-4f9b-a18a-182d79fc7fb1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:50:21 compute-0 nova_compute[187223]: 2025-11-28 17:50:21.735 187227 DEBUG nova.compute.manager [req-7514ac1e-1ff4-4efc-8e99-0a48ea6edded req-6a2fa4bc-e5fa-4463-96e5-39d192b5f3ce 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Processing event network-vif-plugged-04f9669d-47d8-4dc0-ab98-04885312a101 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 28 17:50:21 compute-0 nova_compute[187223]: 2025-11-28 17:50:21.750 187227 DEBUG nova.compute.manager [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 28 17:50:21 compute-0 nova_compute[187223]: 2025-11-28 17:50:21.751 187227 DEBUG nova.virt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Emitting event <LifecycleEvent: 1764352221.7498941, 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:50:21 compute-0 nova_compute[187223]: 2025-11-28 17:50:21.751 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] VM Started (Lifecycle Event)
Nov 28 17:50:21 compute-0 nova_compute[187223]: 2025-11-28 17:50:21.754 187227 DEBUG nova.virt.libvirt.driver [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 28 17:50:21 compute-0 nova_compute[187223]: 2025-11-28 17:50:21.757 187227 INFO nova.virt.libvirt.driver [-] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Instance spawned successfully.
Nov 28 17:50:21 compute-0 nova_compute[187223]: 2025-11-28 17:50:21.757 187227 DEBUG nova.virt.libvirt.driver [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 28 17:50:21 compute-0 nova_compute[187223]: 2025-11-28 17:50:21.806 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:50:21 compute-0 nova_compute[187223]: 2025-11-28 17:50:21.809 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 28 17:50:21 compute-0 nova_compute[187223]: 2025-11-28 17:50:21.819 187227 DEBUG nova.virt.libvirt.driver [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:50:21 compute-0 nova_compute[187223]: 2025-11-28 17:50:21.820 187227 DEBUG nova.virt.libvirt.driver [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:50:21 compute-0 nova_compute[187223]: 2025-11-28 17:50:21.820 187227 DEBUG nova.virt.libvirt.driver [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:50:21 compute-0 nova_compute[187223]: 2025-11-28 17:50:21.821 187227 DEBUG nova.virt.libvirt.driver [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:50:21 compute-0 nova_compute[187223]: 2025-11-28 17:50:21.821 187227 DEBUG nova.virt.libvirt.driver [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:50:21 compute-0 nova_compute[187223]: 2025-11-28 17:50:21.822 187227 DEBUG nova.virt.libvirt.driver [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:50:21 compute-0 nova_compute[187223]: 2025-11-28 17:50:21.880 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 28 17:50:21 compute-0 nova_compute[187223]: 2025-11-28 17:50:21.880 187227 DEBUG nova.virt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Emitting event <LifecycleEvent: 1764352221.750685, 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:50:21 compute-0 nova_compute[187223]: 2025-11-28 17:50:21.881 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] VM Paused (Lifecycle Event)
Nov 28 17:50:21 compute-0 nova_compute[187223]: 2025-11-28 17:50:21.956 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:50:21 compute-0 nova_compute[187223]: 2025-11-28 17:50:21.960 187227 DEBUG nova.virt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Emitting event <LifecycleEvent: 1764352221.7543263, 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:50:21 compute-0 nova_compute[187223]: 2025-11-28 17:50:21.961 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] VM Resumed (Lifecycle Event)
Nov 28 17:50:22 compute-0 nova_compute[187223]: 2025-11-28 17:50:22.008 187227 INFO nova.compute.manager [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Took 10.17 seconds to spawn the instance on the hypervisor.
Nov 28 17:50:22 compute-0 nova_compute[187223]: 2025-11-28 17:50:22.009 187227 DEBUG nova.compute.manager [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:50:22 compute-0 nova_compute[187223]: 2025-11-28 17:50:22.119 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:50:22 compute-0 nova_compute[187223]: 2025-11-28 17:50:22.124 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 28 17:50:22 compute-0 podman[215778]: 2025-11-28 17:50:22.135888069 +0000 UTC m=+0.090447953 container create ed0a93fdcf1c560c33fa9949a641daa8a7c83be507ff9e5f570e784a76366b23 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 17:50:22 compute-0 nova_compute[187223]: 2025-11-28 17:50:22.158 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 28 17:50:22 compute-0 podman[215778]: 2025-11-28 17:50:22.071279376 +0000 UTC m=+0.025839280 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 17:50:22 compute-0 systemd[1]: Started libpod-conmon-ed0a93fdcf1c560c33fa9949a641daa8a7c83be507ff9e5f570e784a76366b23.scope.
Nov 28 17:50:22 compute-0 nova_compute[187223]: 2025-11-28 17:50:22.198 187227 INFO nova.compute.manager [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Took 10.85 seconds to build instance.
Nov 28 17:50:22 compute-0 systemd[1]: Started libcrun container.
Nov 28 17:50:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0981fe86c6f1698aa0d0c8ae4c1317fc7c43cda0bc701bb7cd9c4727c0cf0aab/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 17:50:22 compute-0 podman[215778]: 2025-11-28 17:50:22.22593278 +0000 UTC m=+0.180492684 container init ed0a93fdcf1c560c33fa9949a641daa8a7c83be507ff9e5f570e784a76366b23 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 28 17:50:22 compute-0 nova_compute[187223]: 2025-11-28 17:50:22.232 187227 DEBUG oslo_concurrency.lockutils [None req-03c92bee-18f2-4df5-8e11-2183afc3f9aa 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "4fb5c36e-4336-4f9b-a18a-182d79fc7fb1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.965s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:50:22 compute-0 podman[215778]: 2025-11-28 17:50:22.233580508 +0000 UTC m=+0.188140392 container start ed0a93fdcf1c560c33fa9949a641daa8a7c83be507ff9e5f570e784a76366b23 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Nov 28 17:50:22 compute-0 neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d[215793]: [NOTICE]   (215797) : New worker (215799) forked
Nov 28 17:50:22 compute-0 neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d[215793]: [NOTICE]   (215797) : Loading success.
Nov 28 17:50:22 compute-0 nova_compute[187223]: 2025-11-28 17:50:22.335 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:50:22 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:50:22.335 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3e:3c:af', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '0e:e0:60:f9:5f:25'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 17:50:22 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:50:22.338 104433 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 17:50:22 compute-0 nova_compute[187223]: 2025-11-28 17:50:22.437 187227 DEBUG nova.network.neutron [req-f9af2156-ea39-4a90-85d5-3a98a0de4050 req-7f6a481a-326c-4681-ae23-0e4898a22e22 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Updated VIF entry in instance network info cache for port 04f9669d-47d8-4dc0-ab98-04885312a101. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 28 17:50:22 compute-0 nova_compute[187223]: 2025-11-28 17:50:22.439 187227 DEBUG nova.network.neutron [req-f9af2156-ea39-4a90-85d5-3a98a0de4050 req-7f6a481a-326c-4681-ae23-0e4898a22e22 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Updating instance_info_cache with network_info: [{"id": "04f9669d-47d8-4dc0-ab98-04885312a101", "address": "fa:16:3e:b1:be:39", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04f9669d-47", "ovs_interfaceid": "04f9669d-47d8-4dc0-ab98-04885312a101", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:50:22 compute-0 nova_compute[187223]: 2025-11-28 17:50:22.470 187227 DEBUG oslo_concurrency.lockutils [req-f9af2156-ea39-4a90-85d5-3a98a0de4050 req-7f6a481a-326c-4681-ae23-0e4898a22e22 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Releasing lock "refresh_cache-4fb5c36e-4336-4f9b-a18a-182d79fc7fb1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 17:50:23 compute-0 nova_compute[187223]: 2025-11-28 17:50:23.853 187227 DEBUG nova.compute.manager [req-77079213-a83b-4994-b567-92307c39ca78 req-d09148cd-c06c-433b-a107-776d012bdf01 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Received event network-vif-plugged-04f9669d-47d8-4dc0-ab98-04885312a101 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:50:23 compute-0 nova_compute[187223]: 2025-11-28 17:50:23.854 187227 DEBUG oslo_concurrency.lockutils [req-77079213-a83b-4994-b567-92307c39ca78 req-d09148cd-c06c-433b-a107-776d012bdf01 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "4fb5c36e-4336-4f9b-a18a-182d79fc7fb1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:50:23 compute-0 nova_compute[187223]: 2025-11-28 17:50:23.854 187227 DEBUG oslo_concurrency.lockutils [req-77079213-a83b-4994-b567-92307c39ca78 req-d09148cd-c06c-433b-a107-776d012bdf01 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "4fb5c36e-4336-4f9b-a18a-182d79fc7fb1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:50:23 compute-0 nova_compute[187223]: 2025-11-28 17:50:23.854 187227 DEBUG oslo_concurrency.lockutils [req-77079213-a83b-4994-b567-92307c39ca78 req-d09148cd-c06c-433b-a107-776d012bdf01 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "4fb5c36e-4336-4f9b-a18a-182d79fc7fb1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:50:23 compute-0 nova_compute[187223]: 2025-11-28 17:50:23.854 187227 DEBUG nova.compute.manager [req-77079213-a83b-4994-b567-92307c39ca78 req-d09148cd-c06c-433b-a107-776d012bdf01 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] No waiting events found dispatching network-vif-plugged-04f9669d-47d8-4dc0-ab98-04885312a101 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:50:23 compute-0 nova_compute[187223]: 2025-11-28 17:50:23.855 187227 WARNING nova.compute.manager [req-77079213-a83b-4994-b567-92307c39ca78 req-d09148cd-c06c-433b-a107-776d012bdf01 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Received unexpected event network-vif-plugged-04f9669d-47d8-4dc0-ab98-04885312a101 for instance with vm_state active and task_state None.
Nov 28 17:50:24 compute-0 nova_compute[187223]: 2025-11-28 17:50:24.349 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:50:25 compute-0 nova_compute[187223]: 2025-11-28 17:50:25.384 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:50:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:50:27.703 104433 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:50:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:50:27.703 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:50:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:50:27.704 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:50:28 compute-0 podman[215808]: 2025-11-28 17:50:28.202889026 +0000 UTC m=+0.067815737 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 28 17:50:28 compute-0 podman[215809]: 2025-11-28 17:50:28.234384321 +0000 UTC m=+0.086618683 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_id=ovn_controller, org.label-schema.schema-version=1.0)
Nov 28 17:50:29 compute-0 nova_compute[187223]: 2025-11-28 17:50:29.351 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:50:29 compute-0 podman[197556]: time="2025-11-28T17:50:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:50:29 compute-0 podman[197556]: @ - - [28/Nov/2025:17:50:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18323 "" "Go-http-client/1.1"
Nov 28 17:50:29 compute-0 podman[197556]: @ - - [28/Nov/2025:17:50:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3056 "" "Go-http-client/1.1"
Nov 28 17:50:30 compute-0 nova_compute[187223]: 2025-11-28 17:50:30.385 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:50:31 compute-0 podman[215850]: 2025-11-28 17:50:31.221955265 +0000 UTC m=+0.066300847 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.openshift.expose-services=, architecture=x86_64, maintainer=Red Hat, Inc., release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., name=ubi9-minimal, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible)
Nov 28 17:50:31 compute-0 openstack_network_exporter[199717]: ERROR   17:50:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:50:31 compute-0 openstack_network_exporter[199717]: ERROR   17:50:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:50:31 compute-0 openstack_network_exporter[199717]: ERROR   17:50:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:50:31 compute-0 openstack_network_exporter[199717]: ERROR   17:50:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:50:31 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:50:31 compute-0 openstack_network_exporter[199717]: ERROR   17:50:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:50:31 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:50:32 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:50:32.340 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2ad2dbac-a967-40fb-b69b-7c374c5f8e9d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:50:34 compute-0 nova_compute[187223]: 2025-11-28 17:50:34.353 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:50:34 compute-0 ovn_controller[95574]: 2025-11-28T17:50:34Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b1:be:39 10.100.0.12
Nov 28 17:50:34 compute-0 ovn_controller[95574]: 2025-11-28T17:50:34Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b1:be:39 10.100.0.12
Nov 28 17:50:35 compute-0 nova_compute[187223]: 2025-11-28 17:50:35.389 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:50:39 compute-0 nova_compute[187223]: 2025-11-28 17:50:39.355 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:50:39 compute-0 nova_compute[187223]: 2025-11-28 17:50:39.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:50:40 compute-0 nova_compute[187223]: 2025-11-28 17:50:40.391 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:50:40 compute-0 nova_compute[187223]: 2025-11-28 17:50:40.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:50:40 compute-0 nova_compute[187223]: 2025-11-28 17:50:40.683 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 17:50:41 compute-0 nova_compute[187223]: 2025-11-28 17:50:41.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:50:41 compute-0 nova_compute[187223]: 2025-11-28 17:50:41.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:50:42 compute-0 nova_compute[187223]: 2025-11-28 17:50:42.679 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:50:44 compute-0 podman[215891]: 2025-11-28 17:50:44.185125714 +0000 UTC m=+0.050485472 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 17:50:44 compute-0 nova_compute[187223]: 2025-11-28 17:50:44.357 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:50:44 compute-0 sshd-session[215915]: Connection closed by 193.32.162.145 port 36984
Nov 28 17:50:44 compute-0 nova_compute[187223]: 2025-11-28 17:50:44.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:50:44 compute-0 nova_compute[187223]: 2025-11-28 17:50:44.716 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:50:44 compute-0 nova_compute[187223]: 2025-11-28 17:50:44.716 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:50:44 compute-0 nova_compute[187223]: 2025-11-28 17:50:44.717 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:50:44 compute-0 nova_compute[187223]: 2025-11-28 17:50:44.717 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 17:50:44 compute-0 nova_compute[187223]: 2025-11-28 17:50:44.804 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4fb5c36e-4336-4f9b-a18a-182d79fc7fb1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:50:44 compute-0 nova_compute[187223]: 2025-11-28 17:50:44.858 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4fb5c36e-4336-4f9b-a18a-182d79fc7fb1/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:50:44 compute-0 nova_compute[187223]: 2025-11-28 17:50:44.859 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4fb5c36e-4336-4f9b-a18a-182d79fc7fb1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:50:44 compute-0 nova_compute[187223]: 2025-11-28 17:50:44.912 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4fb5c36e-4336-4f9b-a18a-182d79fc7fb1/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:50:45 compute-0 nova_compute[187223]: 2025-11-28 17:50:45.049 187227 WARNING nova.virt.libvirt.driver [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 17:50:45 compute-0 nova_compute[187223]: 2025-11-28 17:50:45.051 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5699MB free_disk=73.31290817260742GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 17:50:45 compute-0 nova_compute[187223]: 2025-11-28 17:50:45.051 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:50:45 compute-0 nova_compute[187223]: 2025-11-28 17:50:45.051 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:50:45 compute-0 nova_compute[187223]: 2025-11-28 17:50:45.166 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Instance 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 17:50:45 compute-0 nova_compute[187223]: 2025-11-28 17:50:45.167 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 17:50:45 compute-0 nova_compute[187223]: 2025-11-28 17:50:45.167 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 17:50:45 compute-0 nova_compute[187223]: 2025-11-28 17:50:45.281 187227 DEBUG nova.compute.provider_tree [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 17:50:45 compute-0 nova_compute[187223]: 2025-11-28 17:50:45.299 187227 DEBUG nova.scheduler.client.report [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 17:50:45 compute-0 nova_compute[187223]: 2025-11-28 17:50:45.348 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 17:50:45 compute-0 nova_compute[187223]: 2025-11-28 17:50:45.348 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.297s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:50:45 compute-0 nova_compute[187223]: 2025-11-28 17:50:45.394 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:50:47 compute-0 nova_compute[187223]: 2025-11-28 17:50:47.349 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:50:47 compute-0 nova_compute[187223]: 2025-11-28 17:50:47.350 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 17:50:47 compute-0 nova_compute[187223]: 2025-11-28 17:50:47.350 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 17:50:47 compute-0 nova_compute[187223]: 2025-11-28 17:50:47.790 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "refresh_cache-4fb5c36e-4336-4f9b-a18a-182d79fc7fb1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 17:50:47 compute-0 nova_compute[187223]: 2025-11-28 17:50:47.791 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquired lock "refresh_cache-4fb5c36e-4336-4f9b-a18a-182d79fc7fb1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 17:50:47 compute-0 nova_compute[187223]: 2025-11-28 17:50:47.791 187227 DEBUG nova.network.neutron [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 28 17:50:47 compute-0 nova_compute[187223]: 2025-11-28 17:50:47.791 187227 DEBUG nova.objects.instance [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:50:49 compute-0 nova_compute[187223]: 2025-11-28 17:50:49.359 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:50:50 compute-0 nova_compute[187223]: 2025-11-28 17:50:50.297 187227 DEBUG nova.network.neutron [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Updating instance_info_cache with network_info: [{"id": "04f9669d-47d8-4dc0-ab98-04885312a101", "address": "fa:16:3e:b1:be:39", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04f9669d-47", "ovs_interfaceid": "04f9669d-47d8-4dc0-ab98-04885312a101", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:50:50 compute-0 nova_compute[187223]: 2025-11-28 17:50:50.396 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:50:52 compute-0 podman[215924]: 2025-11-28 17:50:52.187617195 +0000 UTC m=+0.043636255 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 28 17:50:53 compute-0 nova_compute[187223]: 2025-11-28 17:50:53.959 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Releasing lock "refresh_cache-4fb5c36e-4336-4f9b-a18a-182d79fc7fb1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 17:50:53 compute-0 nova_compute[187223]: 2025-11-28 17:50:53.960 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 28 17:50:53 compute-0 nova_compute[187223]: 2025-11-28 17:50:53.960 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:50:53 compute-0 nova_compute[187223]: 2025-11-28 17:50:53.960 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:50:54 compute-0 nova_compute[187223]: 2025-11-28 17:50:54.288 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:50:54 compute-0 nova_compute[187223]: 2025-11-28 17:50:54.361 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:50:55 compute-0 nova_compute[187223]: 2025-11-28 17:50:55.398 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:50:59 compute-0 podman[215943]: 2025-11-28 17:50:59.228833338 +0000 UTC m=+0.089735001 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 17:50:59 compute-0 podman[215944]: 2025-11-28 17:50:59.239975658 +0000 UTC m=+0.093603712 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 28 17:50:59 compute-0 nova_compute[187223]: 2025-11-28 17:50:59.409 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:50:59 compute-0 podman[197556]: time="2025-11-28T17:50:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:50:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:50:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18323 "" "Go-http-client/1.1"
Nov 28 17:50:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:50:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3061 "" "Go-http-client/1.1"
Nov 28 17:51:00 compute-0 nova_compute[187223]: 2025-11-28 17:51:00.401 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:51:01 compute-0 openstack_network_exporter[199717]: ERROR   17:51:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:51:01 compute-0 openstack_network_exporter[199717]: ERROR   17:51:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:51:01 compute-0 openstack_network_exporter[199717]: ERROR   17:51:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:51:01 compute-0 openstack_network_exporter[199717]: ERROR   17:51:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:51:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:51:01 compute-0 openstack_network_exporter[199717]: ERROR   17:51:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:51:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:51:02 compute-0 podman[215988]: 2025-11-28 17:51:02.196581488 +0000 UTC m=+0.063653791 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, config_id=edpm, vcs-type=git, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, distribution-scope=public, release=1755695350, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible)
Nov 28 17:51:04 compute-0 nova_compute[187223]: 2025-11-28 17:51:04.412 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:51:05 compute-0 nova_compute[187223]: 2025-11-28 17:51:05.404 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:51:05 compute-0 ovn_controller[95574]: 2025-11-28T17:51:05Z|00155|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Nov 28 17:51:09 compute-0 nova_compute[187223]: 2025-11-28 17:51:09.414 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:51:10 compute-0 nova_compute[187223]: 2025-11-28 17:51:10.407 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:51:14 compute-0 nova_compute[187223]: 2025-11-28 17:51:14.417 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:51:15 compute-0 podman[216010]: 2025-11-28 17:51:15.196835386 +0000 UTC m=+0.051286084 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 17:51:15 compute-0 nova_compute[187223]: 2025-11-28 17:51:15.363 187227 DEBUG nova.compute.manager [None req-bed160aa-b5ef-40dd-a9e8-03ee94426983 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Adding trait COMPUTE_STATUS_DISABLED to compute node resource provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 in placement. update_compute_provider_status /usr/lib/python3.9/site-packages/nova/compute/manager.py:610
Nov 28 17:51:15 compute-0 nova_compute[187223]: 2025-11-28 17:51:15.410 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:51:15 compute-0 nova_compute[187223]: 2025-11-28 17:51:15.429 187227 DEBUG nova.compute.provider_tree [None req-bed160aa-b5ef-40dd-a9e8-03ee94426983 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Updating resource provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 generation from 22 to 28 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Nov 28 17:51:19 compute-0 nova_compute[187223]: 2025-11-28 17:51:19.419 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:51:20 compute-0 nova_compute[187223]: 2025-11-28 17:51:20.413 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:51:20 compute-0 nova_compute[187223]: 2025-11-28 17:51:20.958 187227 DEBUG nova.virt.libvirt.driver [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Check if temp file /var/lib/nova/instances/tmpvvjilafv exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Nov 28 17:51:20 compute-0 nova_compute[187223]: 2025-11-28 17:51:20.959 187227 DEBUG nova.compute.manager [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpvvjilafv',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='4fb5c36e-4336-4f9b-a18a-182d79fc7fb1',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Nov 28 17:51:22 compute-0 nova_compute[187223]: 2025-11-28 17:51:22.278 187227 DEBUG oslo_concurrency.processutils [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4fb5c36e-4336-4f9b-a18a-182d79fc7fb1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:51:22 compute-0 nova_compute[187223]: 2025-11-28 17:51:22.354 187227 DEBUG oslo_concurrency.processutils [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4fb5c36e-4336-4f9b-a18a-182d79fc7fb1/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:51:22 compute-0 nova_compute[187223]: 2025-11-28 17:51:22.356 187227 DEBUG oslo_concurrency.processutils [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4fb5c36e-4336-4f9b-a18a-182d79fc7fb1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:51:22 compute-0 nova_compute[187223]: 2025-11-28 17:51:22.452 187227 DEBUG oslo_concurrency.processutils [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4fb5c36e-4336-4f9b-a18a-182d79fc7fb1/disk --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:51:23 compute-0 podman[216040]: 2025-11-28 17:51:23.192882862 +0000 UTC m=+0.054373614 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true)
Nov 28 17:51:24 compute-0 nova_compute[187223]: 2025-11-28 17:51:24.422 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:51:25 compute-0 sshd-session[216059]: Accepted publickey for nova from 192.168.122.101 port 39468 ssh2: ECDSA SHA256:MQeyK0000lOAPc/y+l43YtWHgJLzao0oTzY3RbV6Jns
Nov 28 17:51:25 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Nov 28 17:51:25 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 28 17:51:25 compute-0 systemd-logind[788]: New session 39 of user nova.
Nov 28 17:51:25 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 28 17:51:25 compute-0 systemd[1]: Starting User Manager for UID 42436...
Nov 28 17:51:25 compute-0 systemd[216063]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 28 17:51:25 compute-0 systemd[216063]: Queued start job for default target Main User Target.
Nov 28 17:51:25 compute-0 systemd[216063]: Created slice User Application Slice.
Nov 28 17:51:25 compute-0 systemd[216063]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 28 17:51:25 compute-0 systemd[216063]: Started Daily Cleanup of User's Temporary Directories.
Nov 28 17:51:25 compute-0 systemd[216063]: Reached target Paths.
Nov 28 17:51:25 compute-0 systemd[216063]: Reached target Timers.
Nov 28 17:51:25 compute-0 systemd[216063]: Starting D-Bus User Message Bus Socket...
Nov 28 17:51:25 compute-0 systemd[216063]: Starting Create User's Volatile Files and Directories...
Nov 28 17:51:25 compute-0 systemd[216063]: Finished Create User's Volatile Files and Directories.
Nov 28 17:51:25 compute-0 systemd[216063]: Listening on D-Bus User Message Bus Socket.
Nov 28 17:51:25 compute-0 systemd[216063]: Reached target Sockets.
Nov 28 17:51:25 compute-0 systemd[216063]: Reached target Basic System.
Nov 28 17:51:25 compute-0 systemd[216063]: Reached target Main User Target.
Nov 28 17:51:25 compute-0 systemd[216063]: Startup finished in 171ms.
Nov 28 17:51:25 compute-0 systemd[1]: Started User Manager for UID 42436.
Nov 28 17:51:25 compute-0 systemd[1]: Started Session 39 of User nova.
Nov 28 17:51:25 compute-0 sshd-session[216059]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 28 17:51:25 compute-0 nova_compute[187223]: 2025-11-28 17:51:25.415 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:51:25 compute-0 sshd-session[216078]: Received disconnect from 192.168.122.101 port 39468:11: disconnected by user
Nov 28 17:51:25 compute-0 sshd-session[216078]: Disconnected from user nova 192.168.122.101 port 39468
Nov 28 17:51:25 compute-0 sshd-session[216059]: pam_unix(sshd:session): session closed for user nova
Nov 28 17:51:25 compute-0 systemd[1]: session-39.scope: Deactivated successfully.
Nov 28 17:51:25 compute-0 systemd-logind[788]: Session 39 logged out. Waiting for processes to exit.
Nov 28 17:51:25 compute-0 systemd-logind[788]: Removed session 39.
Nov 28 17:51:26 compute-0 nova_compute[187223]: 2025-11-28 17:51:26.536 187227 DEBUG nova.compute.manager [req-8f7bb8b5-41c7-4b98-8bef-4405fb022e77 req-1a8b8253-6bbd-4df5-aa2e-9e979bcb0144 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Received event network-vif-unplugged-04f9669d-47d8-4dc0-ab98-04885312a101 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:51:26 compute-0 nova_compute[187223]: 2025-11-28 17:51:26.537 187227 DEBUG oslo_concurrency.lockutils [req-8f7bb8b5-41c7-4b98-8bef-4405fb022e77 req-1a8b8253-6bbd-4df5-aa2e-9e979bcb0144 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "4fb5c36e-4336-4f9b-a18a-182d79fc7fb1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:51:26 compute-0 nova_compute[187223]: 2025-11-28 17:51:26.538 187227 DEBUG oslo_concurrency.lockutils [req-8f7bb8b5-41c7-4b98-8bef-4405fb022e77 req-1a8b8253-6bbd-4df5-aa2e-9e979bcb0144 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "4fb5c36e-4336-4f9b-a18a-182d79fc7fb1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:51:26 compute-0 nova_compute[187223]: 2025-11-28 17:51:26.538 187227 DEBUG oslo_concurrency.lockutils [req-8f7bb8b5-41c7-4b98-8bef-4405fb022e77 req-1a8b8253-6bbd-4df5-aa2e-9e979bcb0144 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "4fb5c36e-4336-4f9b-a18a-182d79fc7fb1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:51:26 compute-0 nova_compute[187223]: 2025-11-28 17:51:26.539 187227 DEBUG nova.compute.manager [req-8f7bb8b5-41c7-4b98-8bef-4405fb022e77 req-1a8b8253-6bbd-4df5-aa2e-9e979bcb0144 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] No waiting events found dispatching network-vif-unplugged-04f9669d-47d8-4dc0-ab98-04885312a101 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:51:26 compute-0 nova_compute[187223]: 2025-11-28 17:51:26.539 187227 DEBUG nova.compute.manager [req-8f7bb8b5-41c7-4b98-8bef-4405fb022e77 req-1a8b8253-6bbd-4df5-aa2e-9e979bcb0144 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Received event network-vif-unplugged-04f9669d-47d8-4dc0-ab98-04885312a101 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 28 17:51:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:51:27.704 104433 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:51:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:51:27.705 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:51:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:51:27.706 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:51:28 compute-0 nova_compute[187223]: 2025-11-28 17:51:27.999 187227 INFO nova.compute.manager [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Took 5.54 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Nov 28 17:51:28 compute-0 nova_compute[187223]: 2025-11-28 17:51:28.000 187227 DEBUG nova.compute.manager [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 28 17:51:28 compute-0 nova_compute[187223]: 2025-11-28 17:51:28.044 187227 DEBUG nova.compute.manager [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpvvjilafv',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='4fb5c36e-4336-4f9b-a18a-182d79fc7fb1',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(1b9d403e-cfe5-4b5d-b993-796562c2b8b8),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Nov 28 17:51:28 compute-0 nova_compute[187223]: 2025-11-28 17:51:28.074 187227 DEBUG nova.objects.instance [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lazy-loading 'migration_context' on Instance uuid 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:51:28 compute-0 nova_compute[187223]: 2025-11-28 17:51:28.075 187227 DEBUG nova.virt.libvirt.driver [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Nov 28 17:51:28 compute-0 nova_compute[187223]: 2025-11-28 17:51:28.076 187227 DEBUG nova.virt.libvirt.driver [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Nov 28 17:51:28 compute-0 nova_compute[187223]: 2025-11-28 17:51:28.076 187227 DEBUG nova.virt.libvirt.driver [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Nov 28 17:51:28 compute-0 nova_compute[187223]: 2025-11-28 17:51:28.091 187227 DEBUG nova.virt.libvirt.vif [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T17:50:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1229670702',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1229670702',id=19,image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-28T17:50:22Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='f987f40adf1f46018ab0ca81b8d954f6',ramdisk_id='',reservation_id='r-yamterc6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',
owner_project_name='tempest-TestExecuteStrategies-384316604',owner_user_name='tempest-TestExecuteStrategies-384316604-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-28T17:50:22Z,user_data=None,user_id='40bca16232f3471c8094a414f8874e9a',uuid=4fb5c36e-4336-4f9b-a18a-182d79fc7fb1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "04f9669d-47d8-4dc0-ab98-04885312a101", "address": "fa:16:3e:b1:be:39", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap04f9669d-47", "ovs_interfaceid": "04f9669d-47d8-4dc0-ab98-04885312a101", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 28 17:51:28 compute-0 nova_compute[187223]: 2025-11-28 17:51:28.092 187227 DEBUG nova.network.os_vif_util [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Converting VIF {"id": "04f9669d-47d8-4dc0-ab98-04885312a101", "address": "fa:16:3e:b1:be:39", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap04f9669d-47", "ovs_interfaceid": "04f9669d-47d8-4dc0-ab98-04885312a101", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 17:51:28 compute-0 nova_compute[187223]: 2025-11-28 17:51:28.092 187227 DEBUG nova.network.os_vif_util [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b1:be:39,bridge_name='br-int',has_traffic_filtering=True,id=04f9669d-47d8-4dc0-ab98-04885312a101,network=Network(7710a7d0-31b3-4473-89c4-40533fdd6e7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap04f9669d-47') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 17:51:28 compute-0 nova_compute[187223]: 2025-11-28 17:51:28.093 187227 DEBUG nova.virt.libvirt.migration [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Updating guest XML with vif config: <interface type="ethernet">
Nov 28 17:51:28 compute-0 nova_compute[187223]:   <mac address="fa:16:3e:b1:be:39"/>
Nov 28 17:51:28 compute-0 nova_compute[187223]:   <model type="virtio"/>
Nov 28 17:51:28 compute-0 nova_compute[187223]:   <driver name="vhost" rx_queue_size="512"/>
Nov 28 17:51:28 compute-0 nova_compute[187223]:   <mtu size="1442"/>
Nov 28 17:51:28 compute-0 nova_compute[187223]:   <target dev="tap04f9669d-47"/>
Nov 28 17:51:28 compute-0 nova_compute[187223]: </interface>
Nov 28 17:51:28 compute-0 nova_compute[187223]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Nov 28 17:51:28 compute-0 nova_compute[187223]: 2025-11-28 17:51:28.093 187227 DEBUG nova.virt.libvirt.driver [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Nov 28 17:51:28 compute-0 nova_compute[187223]: 2025-11-28 17:51:28.579 187227 DEBUG nova.virt.libvirt.migration [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 28 17:51:28 compute-0 nova_compute[187223]: 2025-11-28 17:51:28.580 187227 INFO nova.virt.libvirt.migration [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Increasing downtime to 50 ms after 0 sec elapsed time
Nov 28 17:51:28 compute-0 nova_compute[187223]: 2025-11-28 17:51:28.664 187227 DEBUG nova.compute.manager [req-8965edc9-598f-4c61-8588-05fc3a47813e req-529b3a22-d2ea-4eb9-8957-26bcb9a24d4f 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Received event network-vif-plugged-04f9669d-47d8-4dc0-ab98-04885312a101 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:51:28 compute-0 nova_compute[187223]: 2025-11-28 17:51:28.665 187227 DEBUG oslo_concurrency.lockutils [req-8965edc9-598f-4c61-8588-05fc3a47813e req-529b3a22-d2ea-4eb9-8957-26bcb9a24d4f 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "4fb5c36e-4336-4f9b-a18a-182d79fc7fb1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:51:28 compute-0 nova_compute[187223]: 2025-11-28 17:51:28.665 187227 DEBUG oslo_concurrency.lockutils [req-8965edc9-598f-4c61-8588-05fc3a47813e req-529b3a22-d2ea-4eb9-8957-26bcb9a24d4f 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "4fb5c36e-4336-4f9b-a18a-182d79fc7fb1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:51:28 compute-0 nova_compute[187223]: 2025-11-28 17:51:28.666 187227 DEBUG oslo_concurrency.lockutils [req-8965edc9-598f-4c61-8588-05fc3a47813e req-529b3a22-d2ea-4eb9-8957-26bcb9a24d4f 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "4fb5c36e-4336-4f9b-a18a-182d79fc7fb1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:51:28 compute-0 nova_compute[187223]: 2025-11-28 17:51:28.666 187227 DEBUG nova.compute.manager [req-8965edc9-598f-4c61-8588-05fc3a47813e req-529b3a22-d2ea-4eb9-8957-26bcb9a24d4f 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] No waiting events found dispatching network-vif-plugged-04f9669d-47d8-4dc0-ab98-04885312a101 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:51:28 compute-0 nova_compute[187223]: 2025-11-28 17:51:28.666 187227 WARNING nova.compute.manager [req-8965edc9-598f-4c61-8588-05fc3a47813e req-529b3a22-d2ea-4eb9-8957-26bcb9a24d4f 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Received unexpected event network-vif-plugged-04f9669d-47d8-4dc0-ab98-04885312a101 for instance with vm_state active and task_state migrating.
Nov 28 17:51:28 compute-0 nova_compute[187223]: 2025-11-28 17:51:28.667 187227 DEBUG nova.compute.manager [req-8965edc9-598f-4c61-8588-05fc3a47813e req-529b3a22-d2ea-4eb9-8957-26bcb9a24d4f 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Received event network-changed-04f9669d-47d8-4dc0-ab98-04885312a101 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:51:28 compute-0 nova_compute[187223]: 2025-11-28 17:51:28.667 187227 DEBUG nova.compute.manager [req-8965edc9-598f-4c61-8588-05fc3a47813e req-529b3a22-d2ea-4eb9-8957-26bcb9a24d4f 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Refreshing instance network info cache due to event network-changed-04f9669d-47d8-4dc0-ab98-04885312a101. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 28 17:51:28 compute-0 nova_compute[187223]: 2025-11-28 17:51:28.667 187227 DEBUG oslo_concurrency.lockutils [req-8965edc9-598f-4c61-8588-05fc3a47813e req-529b3a22-d2ea-4eb9-8957-26bcb9a24d4f 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "refresh_cache-4fb5c36e-4336-4f9b-a18a-182d79fc7fb1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 17:51:28 compute-0 nova_compute[187223]: 2025-11-28 17:51:28.668 187227 DEBUG oslo_concurrency.lockutils [req-8965edc9-598f-4c61-8588-05fc3a47813e req-529b3a22-d2ea-4eb9-8957-26bcb9a24d4f 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquired lock "refresh_cache-4fb5c36e-4336-4f9b-a18a-182d79fc7fb1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 17:51:28 compute-0 nova_compute[187223]: 2025-11-28 17:51:28.668 187227 DEBUG nova.network.neutron [req-8965edc9-598f-4c61-8588-05fc3a47813e req-529b3a22-d2ea-4eb9-8957-26bcb9a24d4f 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Refreshing network info cache for port 04f9669d-47d8-4dc0-ab98-04885312a101 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 28 17:51:28 compute-0 nova_compute[187223]: 2025-11-28 17:51:28.694 187227 INFO nova.virt.libvirt.driver [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Nov 28 17:51:29 compute-0 nova_compute[187223]: 2025-11-28 17:51:29.196 187227 DEBUG nova.virt.libvirt.migration [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 28 17:51:29 compute-0 nova_compute[187223]: 2025-11-28 17:51:29.197 187227 DEBUG nova.virt.libvirt.migration [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 28 17:51:29 compute-0 nova_compute[187223]: 2025-11-28 17:51:29.424 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:51:29 compute-0 nova_compute[187223]: 2025-11-28 17:51:29.701 187227 DEBUG nova.virt.libvirt.migration [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 28 17:51:29 compute-0 nova_compute[187223]: 2025-11-28 17:51:29.702 187227 DEBUG nova.virt.libvirt.migration [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 28 17:51:29 compute-0 podman[197556]: time="2025-11-28T17:51:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:51:29 compute-0 podman[197556]: @ - - [28/Nov/2025:17:51:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18323 "" "Go-http-client/1.1"
Nov 28 17:51:29 compute-0 podman[197556]: @ - - [28/Nov/2025:17:51:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3059 "" "Go-http-client/1.1"
Nov 28 17:51:30 compute-0 nova_compute[187223]: 2025-11-28 17:51:30.205 187227 DEBUG nova.virt.libvirt.migration [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 28 17:51:30 compute-0 nova_compute[187223]: 2025-11-28 17:51:30.206 187227 DEBUG nova.virt.libvirt.migration [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 28 17:51:30 compute-0 podman[216096]: 2025-11-28 17:51:30.227411941 +0000 UTC m=+0.081337448 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 28 17:51:30 compute-0 podman[216097]: 2025-11-28 17:51:30.261666936 +0000 UTC m=+0.104786293 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 17:51:30 compute-0 nova_compute[187223]: 2025-11-28 17:51:30.269 187227 DEBUG nova.virt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Emitting event <LifecycleEvent: 1764352290.2688653, 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:51:30 compute-0 nova_compute[187223]: 2025-11-28 17:51:30.269 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] VM Paused (Lifecycle Event)
Nov 28 17:51:30 compute-0 nova_compute[187223]: 2025-11-28 17:51:30.294 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:51:30 compute-0 nova_compute[187223]: 2025-11-28 17:51:30.299 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 28 17:51:30 compute-0 nova_compute[187223]: 2025-11-28 17:51:30.325 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] During sync_power_state the instance has a pending task (migrating). Skip.
Nov 28 17:51:30 compute-0 kernel: tap04f9669d-47 (unregistering): left promiscuous mode
Nov 28 17:51:30 compute-0 NetworkManager[55763]: <info>  [1764352290.4105] device (tap04f9669d-47): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 28 17:51:30 compute-0 ovn_controller[95574]: 2025-11-28T17:51:30Z|00156|binding|INFO|Releasing lport 04f9669d-47d8-4dc0-ab98-04885312a101 from this chassis (sb_readonly=0)
Nov 28 17:51:30 compute-0 nova_compute[187223]: 2025-11-28 17:51:30.418 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:51:30 compute-0 ovn_controller[95574]: 2025-11-28T17:51:30Z|00157|binding|INFO|Setting lport 04f9669d-47d8-4dc0-ab98-04885312a101 down in Southbound
Nov 28 17:51:30 compute-0 ovn_controller[95574]: 2025-11-28T17:51:30Z|00158|binding|INFO|Removing iface tap04f9669d-47 ovn-installed in OVS
Nov 28 17:51:30 compute-0 nova_compute[187223]: 2025-11-28 17:51:30.422 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 17:51:30 compute-0 nova_compute[187223]: 2025-11-28 17:51:30.424 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:51:30 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:51:30.431 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b1:be:39 10.100.0.12'], port_security=['fa:16:3e:b1:be:39 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '01f1e5e2-191c-41ea-9a37-abbc72987efb'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '4fb5c36e-4336-4f9b-a18a-182d79fc7fb1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f987f40adf1f46018ab0ca81b8d954f6', 'neutron:revision_number': '8', 'neutron:security_group_ids': '2454196c-3f33-40c4-b65e-ff5ce51cee25', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8b91a17c-2725-4cda-baa8-a6987e8e4bba, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], logical_port=04f9669d-47d8-4dc0-ab98-04885312a101) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 17:51:30 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:51:30.438 104433 INFO neutron.agent.ovn.metadata.agent [-] Port 04f9669d-47d8-4dc0-ab98-04885312a101 in datapath 7710a7d0-31b3-4473-89c4-40533fdd6e7d unbound from our chassis
Nov 28 17:51:30 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:51:30.443 104433 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7710a7d0-31b3-4473-89c4-40533fdd6e7d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 17:51:30 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:51:30.446 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[cc456daf-8cb5-4517-a49b-1525f675ba4d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:51:30 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:51:30.448 104433 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d namespace which is not needed anymore
Nov 28 17:51:30 compute-0 nova_compute[187223]: 2025-11-28 17:51:30.453 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:51:30 compute-0 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000013.scope: Deactivated successfully.
Nov 28 17:51:30 compute-0 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000013.scope: Consumed 15.600s CPU time.
Nov 28 17:51:30 compute-0 systemd-machined[153517]: Machine qemu-14-instance-00000013 terminated.
Nov 28 17:51:30 compute-0 neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d[215793]: [NOTICE]   (215797) : haproxy version is 2.8.14-c23fe91
Nov 28 17:51:30 compute-0 neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d[215793]: [NOTICE]   (215797) : path to executable is /usr/sbin/haproxy
Nov 28 17:51:30 compute-0 neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d[215793]: [WARNING]  (215797) : Exiting Master process...
Nov 28 17:51:30 compute-0 nova_compute[187223]: 2025-11-28 17:51:30.603 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:51:30 compute-0 neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d[215793]: [ALERT]    (215797) : Current worker (215799) exited with code 143 (Terminated)
Nov 28 17:51:30 compute-0 neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d[215793]: [WARNING]  (215797) : All workers exited. Exiting... (0)
Nov 28 17:51:30 compute-0 nova_compute[187223]: 2025-11-28 17:51:30.608 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:51:30 compute-0 systemd[1]: libpod-ed0a93fdcf1c560c33fa9949a641daa8a7c83be507ff9e5f570e784a76366b23.scope: Deactivated successfully.
Nov 28 17:51:30 compute-0 conmon[215793]: conmon ed0a93fdcf1c560c33fa <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-ed0a93fdcf1c560c33fa9949a641daa8a7c83be507ff9e5f570e784a76366b23.scope/container/memory.events
Nov 28 17:51:30 compute-0 podman[216164]: 2025-11-28 17:51:30.615695192 +0000 UTC m=+0.066791641 container died ed0a93fdcf1c560c33fa9949a641daa8a7c83be507ff9e5f570e784a76366b23 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2)
Nov 28 17:51:30 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ed0a93fdcf1c560c33fa9949a641daa8a7c83be507ff9e5f570e784a76366b23-userdata-shm.mount: Deactivated successfully.
Nov 28 17:51:30 compute-0 systemd[1]: var-lib-containers-storage-overlay-0981fe86c6f1698aa0d0c8ae4c1317fc7c43cda0bc701bb7cd9c4727c0cf0aab-merged.mount: Deactivated successfully.
Nov 28 17:51:30 compute-0 nova_compute[187223]: 2025-11-28 17:51:30.655 187227 DEBUG nova.virt.libvirt.driver [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Nov 28 17:51:30 compute-0 nova_compute[187223]: 2025-11-28 17:51:30.655 187227 DEBUG nova.virt.libvirt.driver [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Nov 28 17:51:30 compute-0 nova_compute[187223]: 2025-11-28 17:51:30.655 187227 DEBUG nova.virt.libvirt.driver [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Nov 28 17:51:30 compute-0 podman[216164]: 2025-11-28 17:51:30.665930796 +0000 UTC m=+0.117027245 container cleanup ed0a93fdcf1c560c33fa9949a641daa8a7c83be507ff9e5f570e784a76366b23 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 28 17:51:30 compute-0 systemd[1]: libpod-conmon-ed0a93fdcf1c560c33fa9949a641daa8a7c83be507ff9e5f570e784a76366b23.scope: Deactivated successfully.
Nov 28 17:51:30 compute-0 nova_compute[187223]: 2025-11-28 17:51:30.708 187227 DEBUG nova.virt.libvirt.guest [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '4fb5c36e-4336-4f9b-a18a-182d79fc7fb1' (instance-00000013) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Nov 28 17:51:30 compute-0 nova_compute[187223]: 2025-11-28 17:51:30.708 187227 INFO nova.virt.libvirt.driver [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Migration operation has completed
Nov 28 17:51:30 compute-0 nova_compute[187223]: 2025-11-28 17:51:30.708 187227 INFO nova.compute.manager [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] _post_live_migration() is started..
Nov 28 17:51:30 compute-0 podman[216208]: 2025-11-28 17:51:30.745855483 +0000 UTC m=+0.052524291 container remove ed0a93fdcf1c560c33fa9949a641daa8a7c83be507ff9e5f570e784a76366b23 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3)
Nov 28 17:51:30 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:51:30.751 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[72b257eb-0bb0-4cbb-b3a7-26b3c1404110]: (4, ('Fri Nov 28 05:51:30 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d (ed0a93fdcf1c560c33fa9949a641daa8a7c83be507ff9e5f570e784a76366b23)\ned0a93fdcf1c560c33fa9949a641daa8a7c83be507ff9e5f570e784a76366b23\nFri Nov 28 05:51:30 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d (ed0a93fdcf1c560c33fa9949a641daa8a7c83be507ff9e5f570e784a76366b23)\ned0a93fdcf1c560c33fa9949a641daa8a7c83be507ff9e5f570e784a76366b23\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:51:30 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:51:30.754 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[4fdee07e-ff99-497f-a435-7fe8a004ce17]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:51:30 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:51:30.756 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7710a7d0-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:51:30 compute-0 kernel: tap7710a7d0-30: left promiscuous mode
Nov 28 17:51:30 compute-0 nova_compute[187223]: 2025-11-28 17:51:30.793 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:51:30 compute-0 nova_compute[187223]: 2025-11-28 17:51:30.809 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:51:30 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:51:30.812 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[93b8d2f5-2dc4-4b3f-9e3e-8a5f0fdbcba6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:51:30 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:51:30.832 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[5ed84107-cbf9-4e41-8e75-4ba3e1f2c462]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:51:30 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:51:30.833 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[fabcf6f6-b7e6-4f35-81ad-5cfdcab02bf5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:51:30 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:51:30.848 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[9fbcd556-eeb9-4f96-8a0b-e4e607983c5b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 539609, 'reachable_time': 18318, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216227, 'error': None, 'target': 'ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:51:30 compute-0 systemd[1]: run-netns-ovnmeta\x2d7710a7d0\x2d31b3\x2d4473\x2d89c4\x2d40533fdd6e7d.mount: Deactivated successfully.
Nov 28 17:51:30 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:51:30.852 104546 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 28 17:51:30 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:51:30.852 104546 DEBUG oslo.privsep.daemon [-] privsep: reply[f6cb71d8-fb3e-4efa-bc18-ff8ad2e26ebb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:51:31 compute-0 openstack_network_exporter[199717]: ERROR   17:51:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:51:31 compute-0 openstack_network_exporter[199717]: ERROR   17:51:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:51:31 compute-0 openstack_network_exporter[199717]: ERROR   17:51:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:51:31 compute-0 openstack_network_exporter[199717]: ERROR   17:51:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:51:31 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:51:31 compute-0 openstack_network_exporter[199717]: ERROR   17:51:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:51:31 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:51:32 compute-0 nova_compute[187223]: 2025-11-28 17:51:32.333 187227 DEBUG nova.compute.manager [req-84e9bc17-ea68-447f-8e21-75951ff78aae req-b96d9855-3107-49bf-9c33-fa0d4da7c9bb 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Received event network-vif-unplugged-04f9669d-47d8-4dc0-ab98-04885312a101 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:51:32 compute-0 nova_compute[187223]: 2025-11-28 17:51:32.334 187227 DEBUG oslo_concurrency.lockutils [req-84e9bc17-ea68-447f-8e21-75951ff78aae req-b96d9855-3107-49bf-9c33-fa0d4da7c9bb 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "4fb5c36e-4336-4f9b-a18a-182d79fc7fb1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:51:32 compute-0 nova_compute[187223]: 2025-11-28 17:51:32.334 187227 DEBUG oslo_concurrency.lockutils [req-84e9bc17-ea68-447f-8e21-75951ff78aae req-b96d9855-3107-49bf-9c33-fa0d4da7c9bb 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "4fb5c36e-4336-4f9b-a18a-182d79fc7fb1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:51:32 compute-0 nova_compute[187223]: 2025-11-28 17:51:32.334 187227 DEBUG oslo_concurrency.lockutils [req-84e9bc17-ea68-447f-8e21-75951ff78aae req-b96d9855-3107-49bf-9c33-fa0d4da7c9bb 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "4fb5c36e-4336-4f9b-a18a-182d79fc7fb1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:51:32 compute-0 nova_compute[187223]: 2025-11-28 17:51:32.334 187227 DEBUG nova.compute.manager [req-84e9bc17-ea68-447f-8e21-75951ff78aae req-b96d9855-3107-49bf-9c33-fa0d4da7c9bb 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] No waiting events found dispatching network-vif-unplugged-04f9669d-47d8-4dc0-ab98-04885312a101 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:51:32 compute-0 nova_compute[187223]: 2025-11-28 17:51:32.335 187227 DEBUG nova.compute.manager [req-84e9bc17-ea68-447f-8e21-75951ff78aae req-b96d9855-3107-49bf-9c33-fa0d4da7c9bb 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Received event network-vif-unplugged-04f9669d-47d8-4dc0-ab98-04885312a101 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 28 17:51:33 compute-0 podman[216228]: 2025-11-28 17:51:33.233131343 +0000 UTC m=+0.081403061 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.33.7, distribution-scope=public, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal)
Nov 28 17:51:33 compute-0 nova_compute[187223]: 2025-11-28 17:51:33.906 187227 DEBUG nova.network.neutron [req-8965edc9-598f-4c61-8588-05fc3a47813e req-529b3a22-d2ea-4eb9-8957-26bcb9a24d4f 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Updated VIF entry in instance network info cache for port 04f9669d-47d8-4dc0-ab98-04885312a101. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 28 17:51:33 compute-0 nova_compute[187223]: 2025-11-28 17:51:33.906 187227 DEBUG nova.network.neutron [req-8965edc9-598f-4c61-8588-05fc3a47813e req-529b3a22-d2ea-4eb9-8957-26bcb9a24d4f 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Updating instance_info_cache with network_info: [{"id": "04f9669d-47d8-4dc0-ab98-04885312a101", "address": "fa:16:3e:b1:be:39", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04f9669d-47", "ovs_interfaceid": "04f9669d-47d8-4dc0-ab98-04885312a101", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:51:34 compute-0 nova_compute[187223]: 2025-11-28 17:51:34.277 187227 DEBUG oslo_concurrency.lockutils [req-8965edc9-598f-4c61-8588-05fc3a47813e req-529b3a22-d2ea-4eb9-8957-26bcb9a24d4f 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Releasing lock "refresh_cache-4fb5c36e-4336-4f9b-a18a-182d79fc7fb1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 17:51:34 compute-0 nova_compute[187223]: 2025-11-28 17:51:34.426 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:51:34 compute-0 nova_compute[187223]: 2025-11-28 17:51:34.529 187227 DEBUG nova.compute.manager [req-3ed903ad-a6b0-4ad1-ab9a-93d3e632f446 req-91c3d603-7cf6-406c-98e7-f8d815f16e69 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Received event network-vif-plugged-04f9669d-47d8-4dc0-ab98-04885312a101 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:51:34 compute-0 nova_compute[187223]: 2025-11-28 17:51:34.530 187227 DEBUG oslo_concurrency.lockutils [req-3ed903ad-a6b0-4ad1-ab9a-93d3e632f446 req-91c3d603-7cf6-406c-98e7-f8d815f16e69 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "4fb5c36e-4336-4f9b-a18a-182d79fc7fb1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:51:34 compute-0 nova_compute[187223]: 2025-11-28 17:51:34.530 187227 DEBUG oslo_concurrency.lockutils [req-3ed903ad-a6b0-4ad1-ab9a-93d3e632f446 req-91c3d603-7cf6-406c-98e7-f8d815f16e69 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "4fb5c36e-4336-4f9b-a18a-182d79fc7fb1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:51:34 compute-0 nova_compute[187223]: 2025-11-28 17:51:34.530 187227 DEBUG oslo_concurrency.lockutils [req-3ed903ad-a6b0-4ad1-ab9a-93d3e632f446 req-91c3d603-7cf6-406c-98e7-f8d815f16e69 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "4fb5c36e-4336-4f9b-a18a-182d79fc7fb1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:51:34 compute-0 nova_compute[187223]: 2025-11-28 17:51:34.530 187227 DEBUG nova.compute.manager [req-3ed903ad-a6b0-4ad1-ab9a-93d3e632f446 req-91c3d603-7cf6-406c-98e7-f8d815f16e69 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] No waiting events found dispatching network-vif-plugged-04f9669d-47d8-4dc0-ab98-04885312a101 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:51:34 compute-0 nova_compute[187223]: 2025-11-28 17:51:34.530 187227 WARNING nova.compute.manager [req-3ed903ad-a6b0-4ad1-ab9a-93d3e632f446 req-91c3d603-7cf6-406c-98e7-f8d815f16e69 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Received unexpected event network-vif-plugged-04f9669d-47d8-4dc0-ab98-04885312a101 for instance with vm_state active and task_state migrating.
Nov 28 17:51:35 compute-0 nova_compute[187223]: 2025-11-28 17:51:35.069 187227 DEBUG nova.network.neutron [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Activated binding for port 04f9669d-47d8-4dc0-ab98-04885312a101 and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Nov 28 17:51:35 compute-0 nova_compute[187223]: 2025-11-28 17:51:35.070 187227 DEBUG nova.compute.manager [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "04f9669d-47d8-4dc0-ab98-04885312a101", "address": "fa:16:3e:b1:be:39", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04f9669d-47", "ovs_interfaceid": "04f9669d-47d8-4dc0-ab98-04885312a101", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Nov 28 17:51:35 compute-0 nova_compute[187223]: 2025-11-28 17:51:35.071 187227 DEBUG nova.virt.libvirt.vif [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T17:50:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1229670702',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1229670702',id=19,image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-28T17:50:22Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='f987f40adf1f46018ab0ca81b8d954f6',ramdisk_id='',reservation_id='r-yamterc6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',
owner_project_name='tempest-TestExecuteStrategies-384316604',owner_user_name='tempest-TestExecuteStrategies-384316604-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-28T17:51:18Z,user_data=None,user_id='40bca16232f3471c8094a414f8874e9a',uuid=4fb5c36e-4336-4f9b-a18a-182d79fc7fb1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "04f9669d-47d8-4dc0-ab98-04885312a101", "address": "fa:16:3e:b1:be:39", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04f9669d-47", "ovs_interfaceid": "04f9669d-47d8-4dc0-ab98-04885312a101", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 28 17:51:35 compute-0 nova_compute[187223]: 2025-11-28 17:51:35.071 187227 DEBUG nova.network.os_vif_util [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Converting VIF {"id": "04f9669d-47d8-4dc0-ab98-04885312a101", "address": "fa:16:3e:b1:be:39", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04f9669d-47", "ovs_interfaceid": "04f9669d-47d8-4dc0-ab98-04885312a101", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 17:51:35 compute-0 nova_compute[187223]: 2025-11-28 17:51:35.072 187227 DEBUG nova.network.os_vif_util [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b1:be:39,bridge_name='br-int',has_traffic_filtering=True,id=04f9669d-47d8-4dc0-ab98-04885312a101,network=Network(7710a7d0-31b3-4473-89c4-40533fdd6e7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap04f9669d-47') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 17:51:35 compute-0 nova_compute[187223]: 2025-11-28 17:51:35.072 187227 DEBUG os_vif [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b1:be:39,bridge_name='br-int',has_traffic_filtering=True,id=04f9669d-47d8-4dc0-ab98-04885312a101,network=Network(7710a7d0-31b3-4473-89c4-40533fdd6e7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap04f9669d-47') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 28 17:51:35 compute-0 nova_compute[187223]: 2025-11-28 17:51:35.075 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:51:35 compute-0 nova_compute[187223]: 2025-11-28 17:51:35.075 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap04f9669d-47, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:51:35 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:51:35.086 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3e:3c:af', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '0e:e0:60:f9:5f:25'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 17:51:35 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:51:35.087 104433 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 17:51:35 compute-0 nova_compute[187223]: 2025-11-28 17:51:35.117 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:51:35 compute-0 nova_compute[187223]: 2025-11-28 17:51:35.119 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:51:35 compute-0 nova_compute[187223]: 2025-11-28 17:51:35.123 187227 INFO os_vif [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b1:be:39,bridge_name='br-int',has_traffic_filtering=True,id=04f9669d-47d8-4dc0-ab98-04885312a101,network=Network(7710a7d0-31b3-4473-89c4-40533fdd6e7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap04f9669d-47')
Nov 28 17:51:35 compute-0 nova_compute[187223]: 2025-11-28 17:51:35.124 187227 DEBUG oslo_concurrency.lockutils [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:51:35 compute-0 nova_compute[187223]: 2025-11-28 17:51:35.125 187227 DEBUG oslo_concurrency.lockutils [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:51:35 compute-0 nova_compute[187223]: 2025-11-28 17:51:35.126 187227 DEBUG oslo_concurrency.lockutils [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:51:35 compute-0 nova_compute[187223]: 2025-11-28 17:51:35.126 187227 DEBUG nova.compute.manager [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Nov 28 17:51:35 compute-0 nova_compute[187223]: 2025-11-28 17:51:35.127 187227 INFO nova.virt.libvirt.driver [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Deleting instance files /var/lib/nova/instances/4fb5c36e-4336-4f9b-a18a-182d79fc7fb1_del
Nov 28 17:51:35 compute-0 nova_compute[187223]: 2025-11-28 17:51:35.128 187227 INFO nova.virt.libvirt.driver [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Deletion of /var/lib/nova/instances/4fb5c36e-4336-4f9b-a18a-182d79fc7fb1_del complete
Nov 28 17:51:35 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Nov 28 17:51:35 compute-0 systemd[216063]: Activating special unit Exit the Session...
Nov 28 17:51:35 compute-0 systemd[216063]: Stopped target Main User Target.
Nov 28 17:51:35 compute-0 systemd[216063]: Stopped target Basic System.
Nov 28 17:51:35 compute-0 systemd[216063]: Stopped target Paths.
Nov 28 17:51:35 compute-0 systemd[216063]: Stopped target Sockets.
Nov 28 17:51:35 compute-0 systemd[216063]: Stopped target Timers.
Nov 28 17:51:35 compute-0 systemd[216063]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 28 17:51:35 compute-0 systemd[216063]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 28 17:51:35 compute-0 systemd[216063]: Closed D-Bus User Message Bus Socket.
Nov 28 17:51:35 compute-0 systemd[216063]: Stopped Create User's Volatile Files and Directories.
Nov 28 17:51:35 compute-0 systemd[216063]: Removed slice User Application Slice.
Nov 28 17:51:35 compute-0 systemd[216063]: Reached target Shutdown.
Nov 28 17:51:35 compute-0 systemd[216063]: Finished Exit the Session.
Nov 28 17:51:35 compute-0 systemd[216063]: Reached target Exit the Session.
Nov 28 17:51:35 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Nov 28 17:51:35 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Nov 28 17:51:35 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 28 17:51:35 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 28 17:51:35 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 28 17:51:35 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 28 17:51:35 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Nov 28 17:51:36 compute-0 nova_compute[187223]: 2025-11-28 17:51:36.633 187227 DEBUG nova.compute.manager [req-d6c16618-897a-4da8-96ad-f1c3dd54d300 req-14db6624-e354-4f6b-93fa-3468af6a5bdd 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Received event network-vif-plugged-04f9669d-47d8-4dc0-ab98-04885312a101 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:51:36 compute-0 nova_compute[187223]: 2025-11-28 17:51:36.633 187227 DEBUG oslo_concurrency.lockutils [req-d6c16618-897a-4da8-96ad-f1c3dd54d300 req-14db6624-e354-4f6b-93fa-3468af6a5bdd 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "4fb5c36e-4336-4f9b-a18a-182d79fc7fb1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:51:36 compute-0 nova_compute[187223]: 2025-11-28 17:51:36.634 187227 DEBUG oslo_concurrency.lockutils [req-d6c16618-897a-4da8-96ad-f1c3dd54d300 req-14db6624-e354-4f6b-93fa-3468af6a5bdd 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "4fb5c36e-4336-4f9b-a18a-182d79fc7fb1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:51:36 compute-0 nova_compute[187223]: 2025-11-28 17:51:36.634 187227 DEBUG oslo_concurrency.lockutils [req-d6c16618-897a-4da8-96ad-f1c3dd54d300 req-14db6624-e354-4f6b-93fa-3468af6a5bdd 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "4fb5c36e-4336-4f9b-a18a-182d79fc7fb1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:51:36 compute-0 nova_compute[187223]: 2025-11-28 17:51:36.635 187227 DEBUG nova.compute.manager [req-d6c16618-897a-4da8-96ad-f1c3dd54d300 req-14db6624-e354-4f6b-93fa-3468af6a5bdd 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] No waiting events found dispatching network-vif-plugged-04f9669d-47d8-4dc0-ab98-04885312a101 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:51:36 compute-0 nova_compute[187223]: 2025-11-28 17:51:36.635 187227 WARNING nova.compute.manager [req-d6c16618-897a-4da8-96ad-f1c3dd54d300 req-14db6624-e354-4f6b-93fa-3468af6a5bdd 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Received unexpected event network-vif-plugged-04f9669d-47d8-4dc0-ab98-04885312a101 for instance with vm_state active and task_state migrating.
Nov 28 17:51:36 compute-0 nova_compute[187223]: 2025-11-28 17:51:36.635 187227 DEBUG nova.compute.manager [req-d6c16618-897a-4da8-96ad-f1c3dd54d300 req-14db6624-e354-4f6b-93fa-3468af6a5bdd 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Received event network-vif-plugged-04f9669d-47d8-4dc0-ab98-04885312a101 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:51:36 compute-0 nova_compute[187223]: 2025-11-28 17:51:36.635 187227 DEBUG oslo_concurrency.lockutils [req-d6c16618-897a-4da8-96ad-f1c3dd54d300 req-14db6624-e354-4f6b-93fa-3468af6a5bdd 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "4fb5c36e-4336-4f9b-a18a-182d79fc7fb1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:51:36 compute-0 nova_compute[187223]: 2025-11-28 17:51:36.636 187227 DEBUG oslo_concurrency.lockutils [req-d6c16618-897a-4da8-96ad-f1c3dd54d300 req-14db6624-e354-4f6b-93fa-3468af6a5bdd 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "4fb5c36e-4336-4f9b-a18a-182d79fc7fb1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:51:36 compute-0 nova_compute[187223]: 2025-11-28 17:51:36.636 187227 DEBUG oslo_concurrency.lockutils [req-d6c16618-897a-4da8-96ad-f1c3dd54d300 req-14db6624-e354-4f6b-93fa-3468af6a5bdd 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "4fb5c36e-4336-4f9b-a18a-182d79fc7fb1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:51:36 compute-0 nova_compute[187223]: 2025-11-28 17:51:36.637 187227 DEBUG nova.compute.manager [req-d6c16618-897a-4da8-96ad-f1c3dd54d300 req-14db6624-e354-4f6b-93fa-3468af6a5bdd 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] No waiting events found dispatching network-vif-plugged-04f9669d-47d8-4dc0-ab98-04885312a101 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:51:36 compute-0 nova_compute[187223]: 2025-11-28 17:51:36.637 187227 WARNING nova.compute.manager [req-d6c16618-897a-4da8-96ad-f1c3dd54d300 req-14db6624-e354-4f6b-93fa-3468af6a5bdd 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Received unexpected event network-vif-plugged-04f9669d-47d8-4dc0-ab98-04885312a101 for instance with vm_state active and task_state migrating.
Nov 28 17:51:39 compute-0 nova_compute[187223]: 2025-11-28 17:51:39.428 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:51:39 compute-0 nova_compute[187223]: 2025-11-28 17:51:39.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:51:40 compute-0 nova_compute[187223]: 2025-11-28 17:51:40.118 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:51:40 compute-0 nova_compute[187223]: 2025-11-28 17:51:40.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:51:40 compute-0 nova_compute[187223]: 2025-11-28 17:51:40.683 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 17:51:41 compute-0 nova_compute[187223]: 2025-11-28 17:51:41.132 187227 DEBUG oslo_concurrency.lockutils [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "4fb5c36e-4336-4f9b-a18a-182d79fc7fb1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:51:41 compute-0 nova_compute[187223]: 2025-11-28 17:51:41.132 187227 DEBUG oslo_concurrency.lockutils [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "4fb5c36e-4336-4f9b-a18a-182d79fc7fb1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:51:41 compute-0 nova_compute[187223]: 2025-11-28 17:51:41.132 187227 DEBUG oslo_concurrency.lockutils [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "4fb5c36e-4336-4f9b-a18a-182d79fc7fb1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:51:41 compute-0 nova_compute[187223]: 2025-11-28 17:51:41.169 187227 DEBUG oslo_concurrency.lockutils [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:51:41 compute-0 nova_compute[187223]: 2025-11-28 17:51:41.170 187227 DEBUG oslo_concurrency.lockutils [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:51:41 compute-0 nova_compute[187223]: 2025-11-28 17:51:41.170 187227 DEBUG oslo_concurrency.lockutils [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:51:41 compute-0 nova_compute[187223]: 2025-11-28 17:51:41.170 187227 DEBUG nova.compute.resource_tracker [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 17:51:41 compute-0 nova_compute[187223]: 2025-11-28 17:51:41.316 187227 WARNING nova.virt.libvirt.driver [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 17:51:41 compute-0 nova_compute[187223]: 2025-11-28 17:51:41.317 187227 DEBUG nova.compute.resource_tracker [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5866MB free_disk=73.34150314331055GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 17:51:41 compute-0 nova_compute[187223]: 2025-11-28 17:51:41.317 187227 DEBUG oslo_concurrency.lockutils [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:51:41 compute-0 nova_compute[187223]: 2025-11-28 17:51:41.318 187227 DEBUG oslo_concurrency.lockutils [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:51:41 compute-0 nova_compute[187223]: 2025-11-28 17:51:41.351 187227 DEBUG nova.compute.resource_tracker [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Migration for instance 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Nov 28 17:51:41 compute-0 nova_compute[187223]: 2025-11-28 17:51:41.371 187227 DEBUG nova.compute.resource_tracker [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Nov 28 17:51:41 compute-0 nova_compute[187223]: 2025-11-28 17:51:41.399 187227 DEBUG nova.compute.resource_tracker [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Migration 1b9d403e-cfe5-4b5d-b993-796562c2b8b8 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Nov 28 17:51:41 compute-0 nova_compute[187223]: 2025-11-28 17:51:41.399 187227 DEBUG nova.compute.resource_tracker [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 17:51:41 compute-0 nova_compute[187223]: 2025-11-28 17:51:41.399 187227 DEBUG nova.compute.resource_tracker [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 17:51:41 compute-0 nova_compute[187223]: 2025-11-28 17:51:41.418 187227 DEBUG nova.scheduler.client.report [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Refreshing inventories for resource provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 28 17:51:41 compute-0 nova_compute[187223]: 2025-11-28 17:51:41.432 187227 DEBUG nova.scheduler.client.report [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Updating ProviderTree inventory for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 28 17:51:41 compute-0 nova_compute[187223]: 2025-11-28 17:51:41.433 187227 DEBUG nova.compute.provider_tree [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Updating inventory in ProviderTree for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 28 17:51:41 compute-0 nova_compute[187223]: 2025-11-28 17:51:41.447 187227 DEBUG nova.scheduler.client.report [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Refreshing aggregate associations for resource provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 28 17:51:41 compute-0 nova_compute[187223]: 2025-11-28 17:51:41.466 187227 DEBUG nova.scheduler.client.report [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Refreshing trait associations for resource provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5, traits: COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_USB,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE41,HW_CPU_X86_SSE42,COMPUTE_STATUS_DISABLED,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_SSE,HW_CPU_X86_SSSE3,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 28 17:51:41 compute-0 nova_compute[187223]: 2025-11-28 17:51:41.499 187227 DEBUG nova.compute.provider_tree [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 17:51:41 compute-0 nova_compute[187223]: 2025-11-28 17:51:41.514 187227 DEBUG nova.scheduler.client.report [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 17:51:41 compute-0 nova_compute[187223]: 2025-11-28 17:51:41.537 187227 DEBUG nova.compute.resource_tracker [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 17:51:41 compute-0 nova_compute[187223]: 2025-11-28 17:51:41.537 187227 DEBUG oslo_concurrency.lockutils [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.220s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:51:41 compute-0 nova_compute[187223]: 2025-11-28 17:51:41.544 187227 INFO nova.compute.manager [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Nov 28 17:51:41 compute-0 nova_compute[187223]: 2025-11-28 17:51:41.646 187227 INFO nova.scheduler.client.report [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Deleted allocation for migration 1b9d403e-cfe5-4b5d-b993-796562c2b8b8
Nov 28 17:51:41 compute-0 nova_compute[187223]: 2025-11-28 17:51:41.646 187227 DEBUG nova.virt.libvirt.driver [None req-467cd945-5808-4976-8291-d8101956426b a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Nov 28 17:51:43 compute-0 nova_compute[187223]: 2025-11-28 17:51:43.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:51:43 compute-0 nova_compute[187223]: 2025-11-28 17:51:43.685 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:51:44 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:51:44.090 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2ad2dbac-a967-40fb-b69b-7c374c5f8e9d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:51:44 compute-0 nova_compute[187223]: 2025-11-28 17:51:44.431 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:51:45 compute-0 nova_compute[187223]: 2025-11-28 17:51:45.120 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:51:45 compute-0 nova_compute[187223]: 2025-11-28 17:51:45.652 187227 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764352290.6503115, 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:51:45 compute-0 nova_compute[187223]: 2025-11-28 17:51:45.652 187227 INFO nova.compute.manager [-] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] VM Stopped (Lifecycle Event)
Nov 28 17:51:45 compute-0 nova_compute[187223]: 2025-11-28 17:51:45.674 187227 DEBUG nova.compute.manager [None req-678b2a4a-3c19-49e7-8e51-cb63170e32d2 - - - - - -] [instance: 4fb5c36e-4336-4f9b-a18a-182d79fc7fb1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:51:46 compute-0 podman[216253]: 2025-11-28 17:51:46.224667371 +0000 UTC m=+0.073058251 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 17:51:46 compute-0 nova_compute[187223]: 2025-11-28 17:51:46.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:51:46 compute-0 nova_compute[187223]: 2025-11-28 17:51:46.780 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:51:46 compute-0 nova_compute[187223]: 2025-11-28 17:51:46.781 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:51:46 compute-0 nova_compute[187223]: 2025-11-28 17:51:46.781 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:51:46 compute-0 nova_compute[187223]: 2025-11-28 17:51:46.781 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 17:51:46 compute-0 nova_compute[187223]: 2025-11-28 17:51:46.927 187227 WARNING nova.virt.libvirt.driver [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 17:51:46 compute-0 nova_compute[187223]: 2025-11-28 17:51:46.928 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5872MB free_disk=73.34150314331055GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 17:51:46 compute-0 nova_compute[187223]: 2025-11-28 17:51:46.929 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:51:46 compute-0 nova_compute[187223]: 2025-11-28 17:51:46.929 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:51:46 compute-0 nova_compute[187223]: 2025-11-28 17:51:46.991 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 17:51:46 compute-0 nova_compute[187223]: 2025-11-28 17:51:46.991 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 17:51:47 compute-0 nova_compute[187223]: 2025-11-28 17:51:47.012 187227 DEBUG nova.compute.provider_tree [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 17:51:47 compute-0 nova_compute[187223]: 2025-11-28 17:51:47.028 187227 DEBUG nova.scheduler.client.report [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 17:51:47 compute-0 nova_compute[187223]: 2025-11-28 17:51:47.030 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 17:51:47 compute-0 nova_compute[187223]: 2025-11-28 17:51:47.030 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.101s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:51:48 compute-0 nova_compute[187223]: 2025-11-28 17:51:48.029 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:51:48 compute-0 nova_compute[187223]: 2025-11-28 17:51:48.030 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 17:51:48 compute-0 nova_compute[187223]: 2025-11-28 17:51:48.030 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 17:51:48 compute-0 nova_compute[187223]: 2025-11-28 17:51:48.053 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 17:51:49 compute-0 nova_compute[187223]: 2025-11-28 17:51:49.431 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:51:49 compute-0 nova_compute[187223]: 2025-11-28 17:51:49.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:51:49 compute-0 nova_compute[187223]: 2025-11-28 17:51:49.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:51:50 compute-0 nova_compute[187223]: 2025-11-28 17:51:50.162 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:51:54 compute-0 podman[216277]: 2025-11-28 17:51:54.195542301 +0000 UTC m=+0.053966524 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 28 17:51:54 compute-0 nova_compute[187223]: 2025-11-28 17:51:54.433 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:51:54 compute-0 nova_compute[187223]: 2025-11-28 17:51:54.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:51:55 compute-0 nova_compute[187223]: 2025-11-28 17:51:55.204 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:51:59 compute-0 nova_compute[187223]: 2025-11-28 17:51:59.435 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:51:59 compute-0 podman[197556]: time="2025-11-28T17:51:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:51:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:51:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 28 17:51:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:51:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2596 "" "Go-http-client/1.1"
Nov 28 17:52:00 compute-0 nova_compute[187223]: 2025-11-28 17:52:00.207 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:52:01 compute-0 podman[216296]: 2025-11-28 17:52:01.201085484 +0000 UTC m=+0.063981212 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible)
Nov 28 17:52:01 compute-0 podman[216297]: 2025-11-28 17:52:01.238038709 +0000 UTC m=+0.095935907 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 17:52:01 compute-0 openstack_network_exporter[199717]: ERROR   17:52:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:52:01 compute-0 openstack_network_exporter[199717]: ERROR   17:52:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:52:01 compute-0 openstack_network_exporter[199717]: ERROR   17:52:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:52:01 compute-0 openstack_network_exporter[199717]: ERROR   17:52:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:52:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:52:01 compute-0 openstack_network_exporter[199717]: ERROR   17:52:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:52:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:52:04 compute-0 podman[216341]: 2025-11-28 17:52:04.199346801 +0000 UTC m=+0.062823925 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, distribution-scope=public, config_id=edpm, vcs-type=git, vendor=Red Hat, Inc., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, name=ubi9-minimal, io.openshift.tags=minimal rhel9, release=1755695350, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41)
Nov 28 17:52:04 compute-0 nova_compute[187223]: 2025-11-28 17:52:04.232 187227 DEBUG nova.compute.manager [None req-20bc1333-658d-43ec-b96b-3a1d4d9352d6 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Removing trait COMPUTE_STATUS_DISABLED from compute node resource provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 in placement. update_compute_provider_status /usr/lib/python3.9/site-packages/nova/compute/manager.py:606
Nov 28 17:52:04 compute-0 nova_compute[187223]: 2025-11-28 17:52:04.304 187227 DEBUG nova.compute.provider_tree [None req-20bc1333-658d-43ec-b96b-3a1d4d9352d6 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Updating resource provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 generation from 29 to 31 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Nov 28 17:52:04 compute-0 nova_compute[187223]: 2025-11-28 17:52:04.484 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:52:05 compute-0 nova_compute[187223]: 2025-11-28 17:52:05.209 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:52:07 compute-0 nova_compute[187223]: 2025-11-28 17:52:07.540 187227 DEBUG oslo_concurrency.lockutils [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Acquiring lock "3ab64e16-6b3f-4112-957c-e2f871b75da3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:52:07 compute-0 nova_compute[187223]: 2025-11-28 17:52:07.540 187227 DEBUG oslo_concurrency.lockutils [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "3ab64e16-6b3f-4112-957c-e2f871b75da3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:52:07 compute-0 nova_compute[187223]: 2025-11-28 17:52:07.557 187227 DEBUG nova.compute.manager [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 28 17:52:07 compute-0 nova_compute[187223]: 2025-11-28 17:52:07.622 187227 DEBUG oslo_concurrency.lockutils [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:52:07 compute-0 nova_compute[187223]: 2025-11-28 17:52:07.622 187227 DEBUG oslo_concurrency.lockutils [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:52:07 compute-0 nova_compute[187223]: 2025-11-28 17:52:07.630 187227 DEBUG nova.virt.hardware [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 28 17:52:07 compute-0 nova_compute[187223]: 2025-11-28 17:52:07.631 187227 INFO nova.compute.claims [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Claim successful on node compute-0.ctlplane.example.com
Nov 28 17:52:07 compute-0 nova_compute[187223]: 2025-11-28 17:52:07.734 187227 DEBUG nova.compute.provider_tree [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 17:52:07 compute-0 nova_compute[187223]: 2025-11-28 17:52:07.749 187227 DEBUG nova.scheduler.client.report [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 17:52:07 compute-0 nova_compute[187223]: 2025-11-28 17:52:07.769 187227 DEBUG oslo_concurrency.lockutils [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.147s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:52:07 compute-0 nova_compute[187223]: 2025-11-28 17:52:07.770 187227 DEBUG nova.compute.manager [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 28 17:52:07 compute-0 nova_compute[187223]: 2025-11-28 17:52:07.819 187227 DEBUG nova.compute.manager [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 28 17:52:07 compute-0 nova_compute[187223]: 2025-11-28 17:52:07.819 187227 DEBUG nova.network.neutron [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 28 17:52:07 compute-0 nova_compute[187223]: 2025-11-28 17:52:07.871 187227 INFO nova.virt.libvirt.driver [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 28 17:52:07 compute-0 nova_compute[187223]: 2025-11-28 17:52:07.901 187227 DEBUG nova.compute.manager [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 28 17:52:07 compute-0 nova_compute[187223]: 2025-11-28 17:52:07.993 187227 DEBUG nova.compute.manager [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 28 17:52:07 compute-0 nova_compute[187223]: 2025-11-28 17:52:07.994 187227 DEBUG nova.virt.libvirt.driver [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 28 17:52:07 compute-0 nova_compute[187223]: 2025-11-28 17:52:07.994 187227 INFO nova.virt.libvirt.driver [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Creating image(s)
Nov 28 17:52:07 compute-0 nova_compute[187223]: 2025-11-28 17:52:07.995 187227 DEBUG oslo_concurrency.lockutils [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Acquiring lock "/var/lib/nova/instances/3ab64e16-6b3f-4112-957c-e2f871b75da3/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:52:07 compute-0 nova_compute[187223]: 2025-11-28 17:52:07.995 187227 DEBUG oslo_concurrency.lockutils [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "/var/lib/nova/instances/3ab64e16-6b3f-4112-957c-e2f871b75da3/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:52:07 compute-0 nova_compute[187223]: 2025-11-28 17:52:07.996 187227 DEBUG oslo_concurrency.lockutils [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "/var/lib/nova/instances/3ab64e16-6b3f-4112-957c-e2f871b75da3/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:52:08 compute-0 nova_compute[187223]: 2025-11-28 17:52:08.011 187227 DEBUG oslo_concurrency.processutils [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:52:08 compute-0 nova_compute[187223]: 2025-11-28 17:52:08.088 187227 DEBUG oslo_concurrency.processutils [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:52:08 compute-0 nova_compute[187223]: 2025-11-28 17:52:08.089 187227 DEBUG oslo_concurrency.lockutils [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Acquiring lock "bd6565e0cc32420aac21dd3de8842bc53bb13eba" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:52:08 compute-0 nova_compute[187223]: 2025-11-28 17:52:08.089 187227 DEBUG oslo_concurrency.lockutils [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "bd6565e0cc32420aac21dd3de8842bc53bb13eba" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:52:08 compute-0 nova_compute[187223]: 2025-11-28 17:52:08.101 187227 DEBUG oslo_concurrency.processutils [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:52:08 compute-0 nova_compute[187223]: 2025-11-28 17:52:08.154 187227 DEBUG oslo_concurrency.processutils [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:52:08 compute-0 nova_compute[187223]: 2025-11-28 17:52:08.155 187227 DEBUG oslo_concurrency.processutils [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba,backing_fmt=raw /var/lib/nova/instances/3ab64e16-6b3f-4112-957c-e2f871b75da3/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:52:08 compute-0 nova_compute[187223]: 2025-11-28 17:52:08.189 187227 DEBUG oslo_concurrency.processutils [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba,backing_fmt=raw /var/lib/nova/instances/3ab64e16-6b3f-4112-957c-e2f871b75da3/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:52:08 compute-0 nova_compute[187223]: 2025-11-28 17:52:08.189 187227 DEBUG oslo_concurrency.lockutils [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "bd6565e0cc32420aac21dd3de8842bc53bb13eba" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.100s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:52:08 compute-0 nova_compute[187223]: 2025-11-28 17:52:08.190 187227 DEBUG oslo_concurrency.processutils [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:52:08 compute-0 nova_compute[187223]: 2025-11-28 17:52:08.242 187227 DEBUG oslo_concurrency.processutils [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:52:08 compute-0 nova_compute[187223]: 2025-11-28 17:52:08.243 187227 DEBUG nova.virt.disk.api [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Checking if we can resize image /var/lib/nova/instances/3ab64e16-6b3f-4112-957c-e2f871b75da3/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 28 17:52:08 compute-0 nova_compute[187223]: 2025-11-28 17:52:08.243 187227 DEBUG oslo_concurrency.processutils [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3ab64e16-6b3f-4112-957c-e2f871b75da3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:52:08 compute-0 nova_compute[187223]: 2025-11-28 17:52:08.305 187227 DEBUG oslo_concurrency.processutils [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3ab64e16-6b3f-4112-957c-e2f871b75da3/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:52:08 compute-0 nova_compute[187223]: 2025-11-28 17:52:08.306 187227 DEBUG nova.virt.disk.api [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Cannot resize image /var/lib/nova/instances/3ab64e16-6b3f-4112-957c-e2f871b75da3/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 28 17:52:08 compute-0 nova_compute[187223]: 2025-11-28 17:52:08.306 187227 DEBUG nova.objects.instance [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lazy-loading 'migration_context' on Instance uuid 3ab64e16-6b3f-4112-957c-e2f871b75da3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:52:08 compute-0 nova_compute[187223]: 2025-11-28 17:52:08.323 187227 DEBUG nova.virt.libvirt.driver [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 28 17:52:08 compute-0 nova_compute[187223]: 2025-11-28 17:52:08.323 187227 DEBUG nova.virt.libvirt.driver [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Ensure instance console log exists: /var/lib/nova/instances/3ab64e16-6b3f-4112-957c-e2f871b75da3/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 28 17:52:08 compute-0 nova_compute[187223]: 2025-11-28 17:52:08.324 187227 DEBUG oslo_concurrency.lockutils [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:52:08 compute-0 nova_compute[187223]: 2025-11-28 17:52:08.324 187227 DEBUG oslo_concurrency.lockutils [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:52:08 compute-0 nova_compute[187223]: 2025-11-28 17:52:08.324 187227 DEBUG oslo_concurrency.lockutils [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:52:08 compute-0 nova_compute[187223]: 2025-11-28 17:52:08.898 187227 DEBUG nova.policy [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '40bca16232f3471c8094a414f8874e9a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f987f40adf1f46018ab0ca81b8d954f6', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 28 17:52:09 compute-0 nova_compute[187223]: 2025-11-28 17:52:09.486 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:52:09 compute-0 nova_compute[187223]: 2025-11-28 17:52:09.625 187227 DEBUG nova.network.neutron [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Successfully created port: 0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 28 17:52:10 compute-0 nova_compute[187223]: 2025-11-28 17:52:10.211 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:52:11 compute-0 nova_compute[187223]: 2025-11-28 17:52:11.920 187227 DEBUG nova.network.neutron [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Successfully updated port: 0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 28 17:52:11 compute-0 nova_compute[187223]: 2025-11-28 17:52:11.948 187227 DEBUG oslo_concurrency.lockutils [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Acquiring lock "refresh_cache-3ab64e16-6b3f-4112-957c-e2f871b75da3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 17:52:11 compute-0 nova_compute[187223]: 2025-11-28 17:52:11.948 187227 DEBUG oslo_concurrency.lockutils [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Acquired lock "refresh_cache-3ab64e16-6b3f-4112-957c-e2f871b75da3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 17:52:11 compute-0 nova_compute[187223]: 2025-11-28 17:52:11.949 187227 DEBUG nova.network.neutron [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 28 17:52:12 compute-0 nova_compute[187223]: 2025-11-28 17:52:12.033 187227 DEBUG nova.compute.manager [req-616db8e2-0645-40cb-8cf1-c2546a3212f3 req-85c2f46e-b4d2-41b2-9ab4-cc6e252d3908 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Received event network-changed-0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:52:12 compute-0 nova_compute[187223]: 2025-11-28 17:52:12.033 187227 DEBUG nova.compute.manager [req-616db8e2-0645-40cb-8cf1-c2546a3212f3 req-85c2f46e-b4d2-41b2-9ab4-cc6e252d3908 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Refreshing instance network info cache due to event network-changed-0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 28 17:52:12 compute-0 nova_compute[187223]: 2025-11-28 17:52:12.033 187227 DEBUG oslo_concurrency.lockutils [req-616db8e2-0645-40cb-8cf1-c2546a3212f3 req-85c2f46e-b4d2-41b2-9ab4-cc6e252d3908 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "refresh_cache-3ab64e16-6b3f-4112-957c-e2f871b75da3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 17:52:12 compute-0 nova_compute[187223]: 2025-11-28 17:52:12.356 187227 DEBUG nova.network.neutron [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 28 17:52:13 compute-0 nova_compute[187223]: 2025-11-28 17:52:13.438 187227 DEBUG nova.network.neutron [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Updating instance_info_cache with network_info: [{"id": "0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da", "address": "fa:16:3e:f7:b8:d5", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ed44b8e-9b", "ovs_interfaceid": "0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:52:13 compute-0 nova_compute[187223]: 2025-11-28 17:52:13.462 187227 DEBUG oslo_concurrency.lockutils [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Releasing lock "refresh_cache-3ab64e16-6b3f-4112-957c-e2f871b75da3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 17:52:13 compute-0 nova_compute[187223]: 2025-11-28 17:52:13.463 187227 DEBUG nova.compute.manager [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Instance network_info: |[{"id": "0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da", "address": "fa:16:3e:f7:b8:d5", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ed44b8e-9b", "ovs_interfaceid": "0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 28 17:52:13 compute-0 nova_compute[187223]: 2025-11-28 17:52:13.463 187227 DEBUG oslo_concurrency.lockutils [req-616db8e2-0645-40cb-8cf1-c2546a3212f3 req-85c2f46e-b4d2-41b2-9ab4-cc6e252d3908 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquired lock "refresh_cache-3ab64e16-6b3f-4112-957c-e2f871b75da3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 17:52:13 compute-0 nova_compute[187223]: 2025-11-28 17:52:13.463 187227 DEBUG nova.network.neutron [req-616db8e2-0645-40cb-8cf1-c2546a3212f3 req-85c2f46e-b4d2-41b2-9ab4-cc6e252d3908 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Refreshing network info cache for port 0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 28 17:52:13 compute-0 nova_compute[187223]: 2025-11-28 17:52:13.466 187227 DEBUG nova.virt.libvirt.driver [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Start _get_guest_xml network_info=[{"id": "0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da", "address": "fa:16:3e:f7:b8:d5", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ed44b8e-9b", "ovs_interfaceid": "0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T17:27:07Z,direct_url=<?>,disk_format='qcow2',id=e66bcfff-a835-4b6a-9892-490d158c356a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ca3b936304c947f98c618f4bf25dee0e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-28T17:27:08Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_format': None, 'device_name': '/dev/vda', 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e66bcfff-a835-4b6a-9892-490d158c356a'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 28 17:52:13 compute-0 nova_compute[187223]: 2025-11-28 17:52:13.471 187227 WARNING nova.virt.libvirt.driver [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 17:52:13 compute-0 nova_compute[187223]: 2025-11-28 17:52:13.476 187227 DEBUG nova.virt.libvirt.host [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 28 17:52:13 compute-0 nova_compute[187223]: 2025-11-28 17:52:13.476 187227 DEBUG nova.virt.libvirt.host [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 28 17:52:13 compute-0 nova_compute[187223]: 2025-11-28 17:52:13.481 187227 DEBUG nova.virt.libvirt.host [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 28 17:52:13 compute-0 nova_compute[187223]: 2025-11-28 17:52:13.482 187227 DEBUG nova.virt.libvirt.host [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 28 17:52:13 compute-0 nova_compute[187223]: 2025-11-28 17:52:13.484 187227 DEBUG nova.virt.libvirt.driver [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 28 17:52:13 compute-0 nova_compute[187223]: 2025-11-28 17:52:13.484 187227 DEBUG nova.virt.hardware [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-28T17:27:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6f44bded-bdbe-4623-9c87-afc5919e8381',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T17:27:07Z,direct_url=<?>,disk_format='qcow2',id=e66bcfff-a835-4b6a-9892-490d158c356a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ca3b936304c947f98c618f4bf25dee0e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-28T17:27:08Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 28 17:52:13 compute-0 nova_compute[187223]: 2025-11-28 17:52:13.484 187227 DEBUG nova.virt.hardware [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 28 17:52:13 compute-0 nova_compute[187223]: 2025-11-28 17:52:13.485 187227 DEBUG nova.virt.hardware [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 28 17:52:13 compute-0 nova_compute[187223]: 2025-11-28 17:52:13.485 187227 DEBUG nova.virt.hardware [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 28 17:52:13 compute-0 nova_compute[187223]: 2025-11-28 17:52:13.485 187227 DEBUG nova.virt.hardware [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 28 17:52:13 compute-0 nova_compute[187223]: 2025-11-28 17:52:13.485 187227 DEBUG nova.virt.hardware [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 28 17:52:13 compute-0 nova_compute[187223]: 2025-11-28 17:52:13.485 187227 DEBUG nova.virt.hardware [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 28 17:52:13 compute-0 nova_compute[187223]: 2025-11-28 17:52:13.486 187227 DEBUG nova.virt.hardware [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 28 17:52:13 compute-0 nova_compute[187223]: 2025-11-28 17:52:13.486 187227 DEBUG nova.virt.hardware [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 28 17:52:13 compute-0 nova_compute[187223]: 2025-11-28 17:52:13.486 187227 DEBUG nova.virt.hardware [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 28 17:52:13 compute-0 nova_compute[187223]: 2025-11-28 17:52:13.486 187227 DEBUG nova.virt.hardware [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 28 17:52:13 compute-0 nova_compute[187223]: 2025-11-28 17:52:13.489 187227 DEBUG nova.virt.libvirt.vif [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T17:52:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1043945484',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1043945484',id=21,image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f987f40adf1f46018ab0ca81b8d954f6',ramdisk_id='',reservation_id='r-7rxv1g1s',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-384316604',owner_user_name='tempest-TestExecuteStrategies-384316604-project-member'},tags=TagList,task_s
tate='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T17:52:07Z,user_data=None,user_id='40bca16232f3471c8094a414f8874e9a',uuid=3ab64e16-6b3f-4112-957c-e2f871b75da3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da", "address": "fa:16:3e:f7:b8:d5", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ed44b8e-9b", "ovs_interfaceid": "0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 28 17:52:13 compute-0 nova_compute[187223]: 2025-11-28 17:52:13.490 187227 DEBUG nova.network.os_vif_util [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Converting VIF {"id": "0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da", "address": "fa:16:3e:f7:b8:d5", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ed44b8e-9b", "ovs_interfaceid": "0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 17:52:13 compute-0 nova_compute[187223]: 2025-11-28 17:52:13.490 187227 DEBUG nova.network.os_vif_util [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f7:b8:d5,bridge_name='br-int',has_traffic_filtering=True,id=0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da,network=Network(7710a7d0-31b3-4473-89c4-40533fdd6e7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ed44b8e-9b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 17:52:13 compute-0 nova_compute[187223]: 2025-11-28 17:52:13.491 187227 DEBUG nova.objects.instance [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lazy-loading 'pci_devices' on Instance uuid 3ab64e16-6b3f-4112-957c-e2f871b75da3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:52:13 compute-0 nova_compute[187223]: 2025-11-28 17:52:13.505 187227 DEBUG nova.virt.libvirt.driver [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] End _get_guest_xml xml=<domain type="kvm">
Nov 28 17:52:13 compute-0 nova_compute[187223]:   <uuid>3ab64e16-6b3f-4112-957c-e2f871b75da3</uuid>
Nov 28 17:52:13 compute-0 nova_compute[187223]:   <name>instance-00000015</name>
Nov 28 17:52:13 compute-0 nova_compute[187223]:   <memory>131072</memory>
Nov 28 17:52:13 compute-0 nova_compute[187223]:   <vcpu>1</vcpu>
Nov 28 17:52:13 compute-0 nova_compute[187223]:   <metadata>
Nov 28 17:52:13 compute-0 nova_compute[187223]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 28 17:52:13 compute-0 nova_compute[187223]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 28 17:52:13 compute-0 nova_compute[187223]:       <nova:name>tempest-TestExecuteStrategies-server-1043945484</nova:name>
Nov 28 17:52:13 compute-0 nova_compute[187223]:       <nova:creationTime>2025-11-28 17:52:13</nova:creationTime>
Nov 28 17:52:13 compute-0 nova_compute[187223]:       <nova:flavor name="m1.nano">
Nov 28 17:52:13 compute-0 nova_compute[187223]:         <nova:memory>128</nova:memory>
Nov 28 17:52:13 compute-0 nova_compute[187223]:         <nova:disk>1</nova:disk>
Nov 28 17:52:13 compute-0 nova_compute[187223]:         <nova:swap>0</nova:swap>
Nov 28 17:52:13 compute-0 nova_compute[187223]:         <nova:ephemeral>0</nova:ephemeral>
Nov 28 17:52:13 compute-0 nova_compute[187223]:         <nova:vcpus>1</nova:vcpus>
Nov 28 17:52:13 compute-0 nova_compute[187223]:       </nova:flavor>
Nov 28 17:52:13 compute-0 nova_compute[187223]:       <nova:owner>
Nov 28 17:52:13 compute-0 nova_compute[187223]:         <nova:user uuid="40bca16232f3471c8094a414f8874e9a">tempest-TestExecuteStrategies-384316604-project-member</nova:user>
Nov 28 17:52:13 compute-0 nova_compute[187223]:         <nova:project uuid="f987f40adf1f46018ab0ca81b8d954f6">tempest-TestExecuteStrategies-384316604</nova:project>
Nov 28 17:52:13 compute-0 nova_compute[187223]:       </nova:owner>
Nov 28 17:52:13 compute-0 nova_compute[187223]:       <nova:root type="image" uuid="e66bcfff-a835-4b6a-9892-490d158c356a"/>
Nov 28 17:52:13 compute-0 nova_compute[187223]:       <nova:ports>
Nov 28 17:52:13 compute-0 nova_compute[187223]:         <nova:port uuid="0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da">
Nov 28 17:52:13 compute-0 nova_compute[187223]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Nov 28 17:52:13 compute-0 nova_compute[187223]:         </nova:port>
Nov 28 17:52:13 compute-0 nova_compute[187223]:       </nova:ports>
Nov 28 17:52:13 compute-0 nova_compute[187223]:     </nova:instance>
Nov 28 17:52:13 compute-0 nova_compute[187223]:   </metadata>
Nov 28 17:52:13 compute-0 nova_compute[187223]:   <sysinfo type="smbios">
Nov 28 17:52:13 compute-0 nova_compute[187223]:     <system>
Nov 28 17:52:13 compute-0 nova_compute[187223]:       <entry name="manufacturer">RDO</entry>
Nov 28 17:52:13 compute-0 nova_compute[187223]:       <entry name="product">OpenStack Compute</entry>
Nov 28 17:52:13 compute-0 nova_compute[187223]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 28 17:52:13 compute-0 nova_compute[187223]:       <entry name="serial">3ab64e16-6b3f-4112-957c-e2f871b75da3</entry>
Nov 28 17:52:13 compute-0 nova_compute[187223]:       <entry name="uuid">3ab64e16-6b3f-4112-957c-e2f871b75da3</entry>
Nov 28 17:52:13 compute-0 nova_compute[187223]:       <entry name="family">Virtual Machine</entry>
Nov 28 17:52:13 compute-0 nova_compute[187223]:     </system>
Nov 28 17:52:13 compute-0 nova_compute[187223]:   </sysinfo>
Nov 28 17:52:13 compute-0 nova_compute[187223]:   <os>
Nov 28 17:52:13 compute-0 nova_compute[187223]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 28 17:52:13 compute-0 nova_compute[187223]:     <boot dev="hd"/>
Nov 28 17:52:13 compute-0 nova_compute[187223]:     <smbios mode="sysinfo"/>
Nov 28 17:52:13 compute-0 nova_compute[187223]:   </os>
Nov 28 17:52:13 compute-0 nova_compute[187223]:   <features>
Nov 28 17:52:13 compute-0 nova_compute[187223]:     <acpi/>
Nov 28 17:52:13 compute-0 nova_compute[187223]:     <apic/>
Nov 28 17:52:13 compute-0 nova_compute[187223]:     <vmcoreinfo/>
Nov 28 17:52:13 compute-0 nova_compute[187223]:   </features>
Nov 28 17:52:13 compute-0 nova_compute[187223]:   <clock offset="utc">
Nov 28 17:52:13 compute-0 nova_compute[187223]:     <timer name="pit" tickpolicy="delay"/>
Nov 28 17:52:13 compute-0 nova_compute[187223]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 28 17:52:13 compute-0 nova_compute[187223]:     <timer name="hpet" present="no"/>
Nov 28 17:52:13 compute-0 nova_compute[187223]:   </clock>
Nov 28 17:52:13 compute-0 nova_compute[187223]:   <cpu mode="custom" match="exact">
Nov 28 17:52:13 compute-0 nova_compute[187223]:     <model>Nehalem</model>
Nov 28 17:52:13 compute-0 nova_compute[187223]:     <topology sockets="1" cores="1" threads="1"/>
Nov 28 17:52:13 compute-0 nova_compute[187223]:   </cpu>
Nov 28 17:52:13 compute-0 nova_compute[187223]:   <devices>
Nov 28 17:52:13 compute-0 nova_compute[187223]:     <disk type="file" device="disk">
Nov 28 17:52:13 compute-0 nova_compute[187223]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 28 17:52:13 compute-0 nova_compute[187223]:       <source file="/var/lib/nova/instances/3ab64e16-6b3f-4112-957c-e2f871b75da3/disk"/>
Nov 28 17:52:13 compute-0 nova_compute[187223]:       <target dev="vda" bus="virtio"/>
Nov 28 17:52:13 compute-0 nova_compute[187223]:     </disk>
Nov 28 17:52:13 compute-0 nova_compute[187223]:     <disk type="file" device="cdrom">
Nov 28 17:52:13 compute-0 nova_compute[187223]:       <driver name="qemu" type="raw" cache="none"/>
Nov 28 17:52:13 compute-0 nova_compute[187223]:       <source file="/var/lib/nova/instances/3ab64e16-6b3f-4112-957c-e2f871b75da3/disk.config"/>
Nov 28 17:52:13 compute-0 nova_compute[187223]:       <target dev="sda" bus="sata"/>
Nov 28 17:52:13 compute-0 nova_compute[187223]:     </disk>
Nov 28 17:52:13 compute-0 nova_compute[187223]:     <interface type="ethernet">
Nov 28 17:52:13 compute-0 nova_compute[187223]:       <mac address="fa:16:3e:f7:b8:d5"/>
Nov 28 17:52:13 compute-0 nova_compute[187223]:       <model type="virtio"/>
Nov 28 17:52:13 compute-0 nova_compute[187223]:       <driver name="vhost" rx_queue_size="512"/>
Nov 28 17:52:13 compute-0 nova_compute[187223]:       <mtu size="1442"/>
Nov 28 17:52:13 compute-0 nova_compute[187223]:       <target dev="tap0ed44b8e-9b"/>
Nov 28 17:52:13 compute-0 nova_compute[187223]:     </interface>
Nov 28 17:52:13 compute-0 nova_compute[187223]:     <serial type="pty">
Nov 28 17:52:13 compute-0 nova_compute[187223]:       <log file="/var/lib/nova/instances/3ab64e16-6b3f-4112-957c-e2f871b75da3/console.log" append="off"/>
Nov 28 17:52:13 compute-0 nova_compute[187223]:     </serial>
Nov 28 17:52:13 compute-0 nova_compute[187223]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 28 17:52:13 compute-0 nova_compute[187223]:     <video>
Nov 28 17:52:13 compute-0 nova_compute[187223]:       <model type="virtio"/>
Nov 28 17:52:13 compute-0 nova_compute[187223]:     </video>
Nov 28 17:52:13 compute-0 nova_compute[187223]:     <input type="tablet" bus="usb"/>
Nov 28 17:52:13 compute-0 nova_compute[187223]:     <rng model="virtio">
Nov 28 17:52:13 compute-0 nova_compute[187223]:       <backend model="random">/dev/urandom</backend>
Nov 28 17:52:13 compute-0 nova_compute[187223]:     </rng>
Nov 28 17:52:13 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root"/>
Nov 28 17:52:13 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:52:13 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:52:13 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:52:13 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:52:13 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:52:13 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:52:13 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:52:13 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:52:13 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:52:13 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:52:13 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:52:13 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:52:13 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:52:13 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:52:13 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:52:13 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:52:13 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:52:13 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:52:13 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:52:13 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:52:13 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:52:13 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:52:13 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:52:13 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:52:13 compute-0 nova_compute[187223]:     <controller type="usb" index="0"/>
Nov 28 17:52:13 compute-0 nova_compute[187223]:     <memballoon model="virtio">
Nov 28 17:52:13 compute-0 nova_compute[187223]:       <stats period="10"/>
Nov 28 17:52:13 compute-0 nova_compute[187223]:     </memballoon>
Nov 28 17:52:13 compute-0 nova_compute[187223]:   </devices>
Nov 28 17:52:13 compute-0 nova_compute[187223]: </domain>
Nov 28 17:52:13 compute-0 nova_compute[187223]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 28 17:52:13 compute-0 nova_compute[187223]: 2025-11-28 17:52:13.506 187227 DEBUG nova.compute.manager [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Preparing to wait for external event network-vif-plugged-0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 28 17:52:13 compute-0 nova_compute[187223]: 2025-11-28 17:52:13.506 187227 DEBUG oslo_concurrency.lockutils [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Acquiring lock "3ab64e16-6b3f-4112-957c-e2f871b75da3-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:52:13 compute-0 nova_compute[187223]: 2025-11-28 17:52:13.507 187227 DEBUG oslo_concurrency.lockutils [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "3ab64e16-6b3f-4112-957c-e2f871b75da3-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:52:13 compute-0 nova_compute[187223]: 2025-11-28 17:52:13.507 187227 DEBUG oslo_concurrency.lockutils [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "3ab64e16-6b3f-4112-957c-e2f871b75da3-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:52:13 compute-0 nova_compute[187223]: 2025-11-28 17:52:13.507 187227 DEBUG nova.virt.libvirt.vif [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T17:52:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1043945484',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1043945484',id=21,image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f987f40adf1f46018ab0ca81b8d954f6',ramdisk_id='',reservation_id='r-7rxv1g1s',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-384316604',owner_user_name='tempest-TestExecuteStrategies-384316604-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T17:52:07Z,user_data=None,user_id='40bca16232f3471c8094a414f8874e9a',uuid=3ab64e16-6b3f-4112-957c-e2f871b75da3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da", "address": "fa:16:3e:f7:b8:d5", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ed44b8e-9b", "ovs_interfaceid": "0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 28 17:52:13 compute-0 nova_compute[187223]: 2025-11-28 17:52:13.508 187227 DEBUG nova.network.os_vif_util [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Converting VIF {"id": "0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da", "address": "fa:16:3e:f7:b8:d5", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ed44b8e-9b", "ovs_interfaceid": "0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 17:52:13 compute-0 nova_compute[187223]: 2025-11-28 17:52:13.508 187227 DEBUG nova.network.os_vif_util [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f7:b8:d5,bridge_name='br-int',has_traffic_filtering=True,id=0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da,network=Network(7710a7d0-31b3-4473-89c4-40533fdd6e7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ed44b8e-9b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 17:52:13 compute-0 nova_compute[187223]: 2025-11-28 17:52:13.508 187227 DEBUG os_vif [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f7:b8:d5,bridge_name='br-int',has_traffic_filtering=True,id=0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da,network=Network(7710a7d0-31b3-4473-89c4-40533fdd6e7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ed44b8e-9b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 28 17:52:13 compute-0 nova_compute[187223]: 2025-11-28 17:52:13.509 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:52:13 compute-0 nova_compute[187223]: 2025-11-28 17:52:13.509 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:52:13 compute-0 nova_compute[187223]: 2025-11-28 17:52:13.510 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 17:52:13 compute-0 nova_compute[187223]: 2025-11-28 17:52:13.512 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:52:13 compute-0 nova_compute[187223]: 2025-11-28 17:52:13.512 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0ed44b8e-9b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:52:13 compute-0 nova_compute[187223]: 2025-11-28 17:52:13.513 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0ed44b8e-9b, col_values=(('external_ids', {'iface-id': '0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f7:b8:d5', 'vm-uuid': '3ab64e16-6b3f-4112-957c-e2f871b75da3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:52:13 compute-0 nova_compute[187223]: 2025-11-28 17:52:13.514 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:52:13 compute-0 NetworkManager[55763]: <info>  [1764352333.5155] manager: (tap0ed44b8e-9b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/67)
Nov 28 17:52:13 compute-0 nova_compute[187223]: 2025-11-28 17:52:13.516 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 17:52:13 compute-0 nova_compute[187223]: 2025-11-28 17:52:13.519 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:52:13 compute-0 nova_compute[187223]: 2025-11-28 17:52:13.520 187227 INFO os_vif [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f7:b8:d5,bridge_name='br-int',has_traffic_filtering=True,id=0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da,network=Network(7710a7d0-31b3-4473-89c4-40533fdd6e7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ed44b8e-9b')
Nov 28 17:52:13 compute-0 nova_compute[187223]: 2025-11-28 17:52:13.563 187227 DEBUG nova.virt.libvirt.driver [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 28 17:52:13 compute-0 nova_compute[187223]: 2025-11-28 17:52:13.564 187227 DEBUG nova.virt.libvirt.driver [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 28 17:52:13 compute-0 nova_compute[187223]: 2025-11-28 17:52:13.564 187227 DEBUG nova.virt.libvirt.driver [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] No VIF found with MAC fa:16:3e:f7:b8:d5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 28 17:52:13 compute-0 nova_compute[187223]: 2025-11-28 17:52:13.564 187227 INFO nova.virt.libvirt.driver [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Using config drive
Nov 28 17:52:14 compute-0 nova_compute[187223]: 2025-11-28 17:52:14.094 187227 INFO nova.virt.libvirt.driver [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Creating config drive at /var/lib/nova/instances/3ab64e16-6b3f-4112-957c-e2f871b75da3/disk.config
Nov 28 17:52:14 compute-0 nova_compute[187223]: 2025-11-28 17:52:14.101 187227 DEBUG oslo_concurrency.processutils [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3ab64e16-6b3f-4112-957c-e2f871b75da3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_9cayvzs execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:52:14 compute-0 nova_compute[187223]: 2025-11-28 17:52:14.226 187227 DEBUG oslo_concurrency.processutils [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3ab64e16-6b3f-4112-957c-e2f871b75da3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_9cayvzs" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:52:14 compute-0 kernel: tap0ed44b8e-9b: entered promiscuous mode
Nov 28 17:52:14 compute-0 NetworkManager[55763]: <info>  [1764352334.3112] manager: (tap0ed44b8e-9b): new Tun device (/org/freedesktop/NetworkManager/Devices/68)
Nov 28 17:52:14 compute-0 nova_compute[187223]: 2025-11-28 17:52:14.310 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:52:14 compute-0 ovn_controller[95574]: 2025-11-28T17:52:14Z|00159|binding|INFO|Claiming lport 0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da for this chassis.
Nov 28 17:52:14 compute-0 ovn_controller[95574]: 2025-11-28T17:52:14Z|00160|binding|INFO|0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da: Claiming fa:16:3e:f7:b8:d5 10.100.0.13
Nov 28 17:52:14 compute-0 nova_compute[187223]: 2025-11-28 17:52:14.313 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:52:14 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:52:14.321 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f7:b8:d5 10.100.0.13'], port_security=['fa:16:3e:f7:b8:d5 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '3ab64e16-6b3f-4112-957c-e2f871b75da3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f987f40adf1f46018ab0ca81b8d954f6', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2454196c-3f33-40c4-b65e-ff5ce51cee25', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8b91a17c-2725-4cda-baa8-a6987e8e4bba, chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], logical_port=0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 17:52:14 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:52:14.322 104433 INFO neutron.agent.ovn.metadata.agent [-] Port 0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da in datapath 7710a7d0-31b3-4473-89c4-40533fdd6e7d bound to our chassis
Nov 28 17:52:14 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:52:14.323 104433 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7710a7d0-31b3-4473-89c4-40533fdd6e7d
Nov 28 17:52:14 compute-0 nova_compute[187223]: 2025-11-28 17:52:14.328 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:52:14 compute-0 ovn_controller[95574]: 2025-11-28T17:52:14Z|00161|binding|INFO|Setting lport 0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da ovn-installed in OVS
Nov 28 17:52:14 compute-0 ovn_controller[95574]: 2025-11-28T17:52:14Z|00162|binding|INFO|Setting lport 0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da up in Southbound
Nov 28 17:52:14 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:52:14.341 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[8682d250-5daa-426b-828d-60500da14c49]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:52:14 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:52:14.342 104433 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7710a7d0-31 in ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 28 17:52:14 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:52:14.344 208826 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7710a7d0-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 28 17:52:14 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:52:14.344 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[607f80d8-c9f6-4cee-af1b-77671411e439]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:52:14 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:52:14.345 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[1d25a96d-35c6-401a-b37d-d4af6e78ad4b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:52:14 compute-0 systemd-udevd[216396]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 17:52:14 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:52:14.356 104546 DEBUG oslo.privsep.daemon [-] privsep: reply[9744adab-c739-4965-81c7-5254773ef884]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:52:14 compute-0 systemd-machined[153517]: New machine qemu-15-instance-00000015.
Nov 28 17:52:14 compute-0 NetworkManager[55763]: <info>  [1764352334.3658] device (tap0ed44b8e-9b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 28 17:52:14 compute-0 NetworkManager[55763]: <info>  [1764352334.3670] device (tap0ed44b8e-9b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 28 17:52:14 compute-0 systemd[1]: Started Virtual Machine qemu-15-instance-00000015.
Nov 28 17:52:14 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:52:14.387 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[120025ce-6eba-4335-8a9b-8f60b359a19d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:52:14 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:52:14.419 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[fda80e69-26bf-483e-b8fd-7a2d0d35d37d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:52:14 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:52:14.424 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[351d1919-91ee-4220-8dac-b971e4317dcd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:52:14 compute-0 NetworkManager[55763]: <info>  [1764352334.4261] manager: (tap7710a7d0-30): new Veth device (/org/freedesktop/NetworkManager/Devices/69)
Nov 28 17:52:14 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:52:14.455 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[e3f7035a-9613-48b8-8230-f1165b86148c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:52:14 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:52:14.458 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[68a48e73-cb98-4214-bc77-98e9494b38c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:52:14 compute-0 NetworkManager[55763]: <info>  [1764352334.4821] device (tap7710a7d0-30): carrier: link connected
Nov 28 17:52:14 compute-0 nova_compute[187223]: 2025-11-28 17:52:14.487 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:52:14 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:52:14.490 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[754467b1-9ae1-4dda-98aa-f1ed50574347]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:52:14 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:52:14.505 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[a76c9e52-8ccb-40f3-a84e-c9c88c1ce45f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7710a7d0-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7e:b9:9f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 48], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 550911, 'reachable_time': 20814, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216429, 'error': None, 'target': 'ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:52:14 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:52:14.519 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[3c2d7530-180a-4b1c-b9e9-8782532434fb]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe7e:b99f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 550911, 'tstamp': 550911}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216430, 'error': None, 'target': 'ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:52:14 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:52:14.534 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[3bfdb808-8e2e-4cd3-9e30-c718af6c4605]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7710a7d0-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7e:b9:9f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 48], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 550911, 'reachable_time': 20814, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216431, 'error': None, 'target': 'ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:52:14 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:52:14.564 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[802d971a-ca08-401d-9db2-e750d5a3319a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:52:14 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:52:14.625 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[e56707c1-b4e3-49e3-b7ed-9f5922b90a2e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:52:14 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:52:14.627 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7710a7d0-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:52:14 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:52:14.627 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 17:52:14 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:52:14.628 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7710a7d0-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:52:14 compute-0 kernel: tap7710a7d0-30: entered promiscuous mode
Nov 28 17:52:14 compute-0 NetworkManager[55763]: <info>  [1764352334.6304] manager: (tap7710a7d0-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/70)
Nov 28 17:52:14 compute-0 nova_compute[187223]: 2025-11-28 17:52:14.631 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:52:14 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:52:14.633 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7710a7d0-30, col_values=(('external_ids', {'iface-id': 'bc789832-2d4b-4b14-95c2-e30a740a3a6b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:52:14 compute-0 ovn_controller[95574]: 2025-11-28T17:52:14Z|00163|binding|INFO|Releasing lport bc789832-2d4b-4b14-95c2-e30a740a3a6b from this chassis (sb_readonly=0)
Nov 28 17:52:14 compute-0 nova_compute[187223]: 2025-11-28 17:52:14.637 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:52:14 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:52:14.641 104433 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7710a7d0-31b3-4473-89c4-40533fdd6e7d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7710a7d0-31b3-4473-89c4-40533fdd6e7d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 28 17:52:14 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:52:14.642 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[f1c2ddd8-88c0-42aa-88f0-3303d78fe001]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:52:14 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:52:14.643 104433 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 28 17:52:14 compute-0 ovn_metadata_agent[104428]: global
Nov 28 17:52:14 compute-0 ovn_metadata_agent[104428]:     log         /dev/log local0 debug
Nov 28 17:52:14 compute-0 ovn_metadata_agent[104428]:     log-tag     haproxy-metadata-proxy-7710a7d0-31b3-4473-89c4-40533fdd6e7d
Nov 28 17:52:14 compute-0 ovn_metadata_agent[104428]:     user        root
Nov 28 17:52:14 compute-0 ovn_metadata_agent[104428]:     group       root
Nov 28 17:52:14 compute-0 ovn_metadata_agent[104428]:     maxconn     1024
Nov 28 17:52:14 compute-0 ovn_metadata_agent[104428]:     pidfile     /var/lib/neutron/external/pids/7710a7d0-31b3-4473-89c4-40533fdd6e7d.pid.haproxy
Nov 28 17:52:14 compute-0 ovn_metadata_agent[104428]:     daemon
Nov 28 17:52:14 compute-0 ovn_metadata_agent[104428]: 
Nov 28 17:52:14 compute-0 ovn_metadata_agent[104428]: defaults
Nov 28 17:52:14 compute-0 ovn_metadata_agent[104428]:     log global
Nov 28 17:52:14 compute-0 ovn_metadata_agent[104428]:     mode http
Nov 28 17:52:14 compute-0 ovn_metadata_agent[104428]:     option httplog
Nov 28 17:52:14 compute-0 ovn_metadata_agent[104428]:     option dontlognull
Nov 28 17:52:14 compute-0 ovn_metadata_agent[104428]:     option http-server-close
Nov 28 17:52:14 compute-0 ovn_metadata_agent[104428]:     option forwardfor
Nov 28 17:52:14 compute-0 ovn_metadata_agent[104428]:     retries                 3
Nov 28 17:52:14 compute-0 ovn_metadata_agent[104428]:     timeout http-request    30s
Nov 28 17:52:14 compute-0 ovn_metadata_agent[104428]:     timeout connect         30s
Nov 28 17:52:14 compute-0 ovn_metadata_agent[104428]:     timeout client          32s
Nov 28 17:52:14 compute-0 ovn_metadata_agent[104428]:     timeout server          32s
Nov 28 17:52:14 compute-0 ovn_metadata_agent[104428]:     timeout http-keep-alive 30s
Nov 28 17:52:14 compute-0 ovn_metadata_agent[104428]: 
Nov 28 17:52:14 compute-0 ovn_metadata_agent[104428]: 
Nov 28 17:52:14 compute-0 ovn_metadata_agent[104428]: listen listener
Nov 28 17:52:14 compute-0 ovn_metadata_agent[104428]:     bind 169.254.169.254:80
Nov 28 17:52:14 compute-0 ovn_metadata_agent[104428]:     server metadata /var/lib/neutron/metadata_proxy
Nov 28 17:52:14 compute-0 ovn_metadata_agent[104428]:     http-request add-header X-OVN-Network-ID 7710a7d0-31b3-4473-89c4-40533fdd6e7d
Nov 28 17:52:14 compute-0 ovn_metadata_agent[104428]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 28 17:52:14 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:52:14.644 104433 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'env', 'PROCESS_TAG=haproxy-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7710a7d0-31b3-4473-89c4-40533fdd6e7d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 28 17:52:14 compute-0 nova_compute[187223]: 2025-11-28 17:52:14.648 187227 DEBUG nova.compute.manager [req-47bab1dc-c1ec-4dc6-8b6e-14a01138ffa0 req-5b34f033-f8da-4ce4-8d83-98e5bc6369e0 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Received event network-vif-plugged-0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:52:14 compute-0 nova_compute[187223]: 2025-11-28 17:52:14.649 187227 DEBUG oslo_concurrency.lockutils [req-47bab1dc-c1ec-4dc6-8b6e-14a01138ffa0 req-5b34f033-f8da-4ce4-8d83-98e5bc6369e0 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "3ab64e16-6b3f-4112-957c-e2f871b75da3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:52:14 compute-0 nova_compute[187223]: 2025-11-28 17:52:14.649 187227 DEBUG oslo_concurrency.lockutils [req-47bab1dc-c1ec-4dc6-8b6e-14a01138ffa0 req-5b34f033-f8da-4ce4-8d83-98e5bc6369e0 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "3ab64e16-6b3f-4112-957c-e2f871b75da3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:52:14 compute-0 nova_compute[187223]: 2025-11-28 17:52:14.649 187227 DEBUG oslo_concurrency.lockutils [req-47bab1dc-c1ec-4dc6-8b6e-14a01138ffa0 req-5b34f033-f8da-4ce4-8d83-98e5bc6369e0 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "3ab64e16-6b3f-4112-957c-e2f871b75da3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:52:14 compute-0 nova_compute[187223]: 2025-11-28 17:52:14.650 187227 DEBUG nova.compute.manager [req-47bab1dc-c1ec-4dc6-8b6e-14a01138ffa0 req-5b34f033-f8da-4ce4-8d83-98e5bc6369e0 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Processing event network-vif-plugged-0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 28 17:52:14 compute-0 nova_compute[187223]: 2025-11-28 17:52:14.650 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:52:14 compute-0 nova_compute[187223]: 2025-11-28 17:52:14.852 187227 DEBUG nova.compute.manager [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 28 17:52:14 compute-0 nova_compute[187223]: 2025-11-28 17:52:14.853 187227 DEBUG nova.virt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Emitting event <LifecycleEvent: 1764352334.852922, 3ab64e16-6b3f-4112-957c-e2f871b75da3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:52:14 compute-0 nova_compute[187223]: 2025-11-28 17:52:14.853 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] VM Started (Lifecycle Event)
Nov 28 17:52:14 compute-0 nova_compute[187223]: 2025-11-28 17:52:14.858 187227 DEBUG nova.virt.libvirt.driver [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 28 17:52:14 compute-0 nova_compute[187223]: 2025-11-28 17:52:14.866 187227 INFO nova.virt.libvirt.driver [-] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Instance spawned successfully.
Nov 28 17:52:14 compute-0 nova_compute[187223]: 2025-11-28 17:52:14.867 187227 DEBUG nova.virt.libvirt.driver [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 28 17:52:14 compute-0 nova_compute[187223]: 2025-11-28 17:52:14.876 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:52:14 compute-0 nova_compute[187223]: 2025-11-28 17:52:14.879 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 28 17:52:14 compute-0 nova_compute[187223]: 2025-11-28 17:52:14.903 187227 DEBUG nova.virt.libvirt.driver [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:52:14 compute-0 nova_compute[187223]: 2025-11-28 17:52:14.903 187227 DEBUG nova.virt.libvirt.driver [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:52:14 compute-0 nova_compute[187223]: 2025-11-28 17:52:14.904 187227 DEBUG nova.virt.libvirt.driver [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:52:14 compute-0 nova_compute[187223]: 2025-11-28 17:52:14.904 187227 DEBUG nova.virt.libvirt.driver [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:52:14 compute-0 nova_compute[187223]: 2025-11-28 17:52:14.905 187227 DEBUG nova.virt.libvirt.driver [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:52:14 compute-0 nova_compute[187223]: 2025-11-28 17:52:14.905 187227 DEBUG nova.virt.libvirt.driver [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:52:14 compute-0 nova_compute[187223]: 2025-11-28 17:52:14.909 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 28 17:52:14 compute-0 nova_compute[187223]: 2025-11-28 17:52:14.909 187227 DEBUG nova.virt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Emitting event <LifecycleEvent: 1764352334.8547611, 3ab64e16-6b3f-4112-957c-e2f871b75da3 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:52:14 compute-0 nova_compute[187223]: 2025-11-28 17:52:14.909 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] VM Paused (Lifecycle Event)
Nov 28 17:52:14 compute-0 nova_compute[187223]: 2025-11-28 17:52:14.945 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:52:14 compute-0 nova_compute[187223]: 2025-11-28 17:52:14.948 187227 DEBUG nova.virt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Emitting event <LifecycleEvent: 1764352334.8579807, 3ab64e16-6b3f-4112-957c-e2f871b75da3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:52:14 compute-0 nova_compute[187223]: 2025-11-28 17:52:14.948 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] VM Resumed (Lifecycle Event)
Nov 28 17:52:14 compute-0 nova_compute[187223]: 2025-11-28 17:52:14.981 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:52:14 compute-0 nova_compute[187223]: 2025-11-28 17:52:14.985 187227 INFO nova.compute.manager [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Took 6.99 seconds to spawn the instance on the hypervisor.
Nov 28 17:52:14 compute-0 nova_compute[187223]: 2025-11-28 17:52:14.985 187227 DEBUG nova.compute.manager [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:52:14 compute-0 nova_compute[187223]: 2025-11-28 17:52:14.987 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 28 17:52:15 compute-0 nova_compute[187223]: 2025-11-28 17:52:15.020 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 28 17:52:15 compute-0 podman[216469]: 2025-11-28 17:52:15.033551317 +0000 UTC m=+0.053457541 container create d9b31e4ed7a44b41c65b302005909b224502c645c4f671d0c71073313448d649 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 28 17:52:15 compute-0 nova_compute[187223]: 2025-11-28 17:52:15.067 187227 INFO nova.compute.manager [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Took 7.46 seconds to build instance.
Nov 28 17:52:15 compute-0 systemd[1]: Started libpod-conmon-d9b31e4ed7a44b41c65b302005909b224502c645c4f671d0c71073313448d649.scope.
Nov 28 17:52:15 compute-0 systemd[1]: Started libcrun container.
Nov 28 17:52:15 compute-0 nova_compute[187223]: 2025-11-28 17:52:15.099 187227 DEBUG oslo_concurrency.lockutils [None req-84854c40-949b-4b5c-b7d4-22554bd2e0a1 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "3ab64e16-6b3f-4112-957c-e2f871b75da3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.559s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:52:15 compute-0 podman[216469]: 2025-11-28 17:52:15.007036362 +0000 UTC m=+0.026942606 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 17:52:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/35fc5d7f06d57642da24c2c953d51804b5aa58e72a68fc5747de5efabd3fffec/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 17:52:15 compute-0 podman[216469]: 2025-11-28 17:52:15.129750789 +0000 UTC m=+0.149657033 container init d9b31e4ed7a44b41c65b302005909b224502c645c4f671d0c71073313448d649 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 17:52:15 compute-0 podman[216469]: 2025-11-28 17:52:15.13605714 +0000 UTC m=+0.155963364 container start d9b31e4ed7a44b41c65b302005909b224502c645c4f671d0c71073313448d649 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 17:52:15 compute-0 neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d[216484]: [NOTICE]   (216488) : New worker (216490) forked
Nov 28 17:52:15 compute-0 neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d[216484]: [NOTICE]   (216488) : Loading success.
Nov 28 17:52:15 compute-0 nova_compute[187223]: 2025-11-28 17:52:15.273 187227 DEBUG nova.network.neutron [req-616db8e2-0645-40cb-8cf1-c2546a3212f3 req-85c2f46e-b4d2-41b2-9ab4-cc6e252d3908 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Updated VIF entry in instance network info cache for port 0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 28 17:52:15 compute-0 nova_compute[187223]: 2025-11-28 17:52:15.274 187227 DEBUG nova.network.neutron [req-616db8e2-0645-40cb-8cf1-c2546a3212f3 req-85c2f46e-b4d2-41b2-9ab4-cc6e252d3908 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Updating instance_info_cache with network_info: [{"id": "0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da", "address": "fa:16:3e:f7:b8:d5", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ed44b8e-9b", "ovs_interfaceid": "0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:52:15 compute-0 nova_compute[187223]: 2025-11-28 17:52:15.378 187227 DEBUG oslo_concurrency.lockutils [req-616db8e2-0645-40cb-8cf1-c2546a3212f3 req-85c2f46e-b4d2-41b2-9ab4-cc6e252d3908 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Releasing lock "refresh_cache-3ab64e16-6b3f-4112-957c-e2f871b75da3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 17:52:16 compute-0 nova_compute[187223]: 2025-11-28 17:52:16.717 187227 DEBUG nova.compute.manager [req-8559cd33-06b0-409e-9767-3b2a466fc090 req-9f667513-9f70-4360-882c-2edb57bfe4de 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Received event network-vif-plugged-0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:52:16 compute-0 nova_compute[187223]: 2025-11-28 17:52:16.718 187227 DEBUG oslo_concurrency.lockutils [req-8559cd33-06b0-409e-9767-3b2a466fc090 req-9f667513-9f70-4360-882c-2edb57bfe4de 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "3ab64e16-6b3f-4112-957c-e2f871b75da3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:52:16 compute-0 nova_compute[187223]: 2025-11-28 17:52:16.718 187227 DEBUG oslo_concurrency.lockutils [req-8559cd33-06b0-409e-9767-3b2a466fc090 req-9f667513-9f70-4360-882c-2edb57bfe4de 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "3ab64e16-6b3f-4112-957c-e2f871b75da3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:52:16 compute-0 nova_compute[187223]: 2025-11-28 17:52:16.718 187227 DEBUG oslo_concurrency.lockutils [req-8559cd33-06b0-409e-9767-3b2a466fc090 req-9f667513-9f70-4360-882c-2edb57bfe4de 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "3ab64e16-6b3f-4112-957c-e2f871b75da3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:52:16 compute-0 nova_compute[187223]: 2025-11-28 17:52:16.718 187227 DEBUG nova.compute.manager [req-8559cd33-06b0-409e-9767-3b2a466fc090 req-9f667513-9f70-4360-882c-2edb57bfe4de 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] No waiting events found dispatching network-vif-plugged-0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:52:16 compute-0 nova_compute[187223]: 2025-11-28 17:52:16.718 187227 WARNING nova.compute.manager [req-8559cd33-06b0-409e-9767-3b2a466fc090 req-9f667513-9f70-4360-882c-2edb57bfe4de 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Received unexpected event network-vif-plugged-0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da for instance with vm_state active and task_state None.
Nov 28 17:52:17 compute-0 podman[216499]: 2025-11-28 17:52:17.200541865 +0000 UTC m=+0.066503023 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 17:52:18 compute-0 nova_compute[187223]: 2025-11-28 17:52:18.516 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:52:19 compute-0 nova_compute[187223]: 2025-11-28 17:52:19.489 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:52:23 compute-0 nova_compute[187223]: 2025-11-28 17:52:23.560 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:52:24 compute-0 nova_compute[187223]: 2025-11-28 17:52:24.492 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:52:25 compute-0 podman[216524]: 2025-11-28 17:52:25.243360498 +0000 UTC m=+0.085442106 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 17:52:27 compute-0 ovn_controller[95574]: 2025-11-28T17:52:27Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f7:b8:d5 10.100.0.13
Nov 28 17:52:27 compute-0 ovn_controller[95574]: 2025-11-28T17:52:27Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f7:b8:d5 10.100.0.13
Nov 28 17:52:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:52:27.705 104433 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:52:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:52:27.706 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:52:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:52:27.707 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:52:28 compute-0 nova_compute[187223]: 2025-11-28 17:52:28.562 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:52:29 compute-0 nova_compute[187223]: 2025-11-28 17:52:29.493 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:52:29 compute-0 podman[197556]: time="2025-11-28T17:52:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:52:29 compute-0 podman[197556]: @ - - [28/Nov/2025:17:52:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18323 "" "Go-http-client/1.1"
Nov 28 17:52:29 compute-0 podman[197556]: @ - - [28/Nov/2025:17:52:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3057 "" "Go-http-client/1.1"
Nov 28 17:52:31 compute-0 openstack_network_exporter[199717]: ERROR   17:52:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:52:31 compute-0 openstack_network_exporter[199717]: ERROR   17:52:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:52:31 compute-0 openstack_network_exporter[199717]: ERROR   17:52:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:52:31 compute-0 openstack_network_exporter[199717]: ERROR   17:52:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:52:31 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:52:31 compute-0 openstack_network_exporter[199717]: ERROR   17:52:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:52:31 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:52:32 compute-0 podman[216560]: 2025-11-28 17:52:32.19455189 +0000 UTC m=+0.055728945 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3)
Nov 28 17:52:32 compute-0 podman[216561]: 2025-11-28 17:52:32.220589763 +0000 UTC m=+0.078417938 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3)
Nov 28 17:52:33 compute-0 nova_compute[187223]: 2025-11-28 17:52:33.565 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:52:34 compute-0 nova_compute[187223]: 2025-11-28 17:52:34.691 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:52:35 compute-0 podman[216602]: 2025-11-28 17:52:35.203731377 +0000 UTC m=+0.061762939 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.33.7, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, maintainer=Red Hat, Inc., config_id=edpm, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter)
Nov 28 17:52:35 compute-0 nova_compute[187223]: 2025-11-28 17:52:35.366 187227 DEBUG nova.virt.libvirt.driver [None req-b9bfc8b5-e0dd-4757-8272-14d08c69c47f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Check if temp file /var/lib/nova/instances/tmp007gfhti exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Nov 28 17:52:35 compute-0 nova_compute[187223]: 2025-11-28 17:52:35.366 187227 DEBUG nova.compute.manager [None req-b9bfc8b5-e0dd-4757-8272-14d08c69c47f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp007gfhti',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='3ab64e16-6b3f-4112-957c-e2f871b75da3',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Nov 28 17:52:36 compute-0 nova_compute[187223]: 2025-11-28 17:52:36.273 187227 DEBUG oslo_concurrency.processutils [None req-b9bfc8b5-e0dd-4757-8272-14d08c69c47f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3ab64e16-6b3f-4112-957c-e2f871b75da3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:52:36 compute-0 nova_compute[187223]: 2025-11-28 17:52:36.365 187227 DEBUG oslo_concurrency.processutils [None req-b9bfc8b5-e0dd-4757-8272-14d08c69c47f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3ab64e16-6b3f-4112-957c-e2f871b75da3/disk --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:52:36 compute-0 nova_compute[187223]: 2025-11-28 17:52:36.366 187227 DEBUG oslo_concurrency.processutils [None req-b9bfc8b5-e0dd-4757-8272-14d08c69c47f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3ab64e16-6b3f-4112-957c-e2f871b75da3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:52:36 compute-0 nova_compute[187223]: 2025-11-28 17:52:36.429 187227 DEBUG oslo_concurrency.processutils [None req-b9bfc8b5-e0dd-4757-8272-14d08c69c47f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3ab64e16-6b3f-4112-957c-e2f871b75da3/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:52:37 compute-0 nova_compute[187223]: 2025-11-28 17:52:37.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:52:37 compute-0 nova_compute[187223]: 2025-11-28 17:52:37.685 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 28 17:52:37 compute-0 nova_compute[187223]: 2025-11-28 17:52:37.709 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 28 17:52:38 compute-0 nova_compute[187223]: 2025-11-28 17:52:38.569 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:52:38 compute-0 sshd-session[216629]: Accepted publickey for nova from 192.168.122.101 port 56772 ssh2: ECDSA SHA256:MQeyK0000lOAPc/y+l43YtWHgJLzao0oTzY3RbV6Jns
Nov 28 17:52:38 compute-0 systemd-logind[788]: New session 41 of user nova.
Nov 28 17:52:38 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Nov 28 17:52:39 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 28 17:52:39 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 28 17:52:39 compute-0 systemd[1]: Starting User Manager for UID 42436...
Nov 28 17:52:39 compute-0 systemd[216633]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 28 17:52:39 compute-0 systemd[216633]: Queued start job for default target Main User Target.
Nov 28 17:52:39 compute-0 systemd[216633]: Created slice User Application Slice.
Nov 28 17:52:39 compute-0 systemd[216633]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 28 17:52:39 compute-0 systemd[216633]: Started Daily Cleanup of User's Temporary Directories.
Nov 28 17:52:39 compute-0 systemd[216633]: Reached target Paths.
Nov 28 17:52:39 compute-0 systemd[216633]: Reached target Timers.
Nov 28 17:52:39 compute-0 systemd[216633]: Starting D-Bus User Message Bus Socket...
Nov 28 17:52:39 compute-0 systemd[216633]: Starting Create User's Volatile Files and Directories...
Nov 28 17:52:39 compute-0 systemd[216633]: Listening on D-Bus User Message Bus Socket.
Nov 28 17:52:39 compute-0 systemd[216633]: Reached target Sockets.
Nov 28 17:52:39 compute-0 systemd[216633]: Finished Create User's Volatile Files and Directories.
Nov 28 17:52:39 compute-0 systemd[216633]: Reached target Basic System.
Nov 28 17:52:39 compute-0 systemd[216633]: Reached target Main User Target.
Nov 28 17:52:39 compute-0 systemd[216633]: Startup finished in 128ms.
Nov 28 17:52:39 compute-0 systemd[1]: Started User Manager for UID 42436.
Nov 28 17:52:39 compute-0 systemd[1]: Started Session 41 of User nova.
Nov 28 17:52:39 compute-0 sshd-session[216629]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 28 17:52:39 compute-0 sshd-session[216648]: Received disconnect from 192.168.122.101 port 56772:11: disconnected by user
Nov 28 17:52:39 compute-0 sshd-session[216648]: Disconnected from user nova 192.168.122.101 port 56772
Nov 28 17:52:39 compute-0 sshd-session[216629]: pam_unix(sshd:session): session closed for user nova
Nov 28 17:52:39 compute-0 systemd[1]: session-41.scope: Deactivated successfully.
Nov 28 17:52:39 compute-0 systemd-logind[788]: Session 41 logged out. Waiting for processes to exit.
Nov 28 17:52:39 compute-0 systemd-logind[788]: Removed session 41.
Nov 28 17:52:39 compute-0 nova_compute[187223]: 2025-11-28 17:52:39.694 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:52:40 compute-0 nova_compute[187223]: 2025-11-28 17:52:40.107 187227 DEBUG nova.compute.manager [req-2f037359-2f37-4ce1-85d3-29d9d0aab5fc req-1a9f4eed-8a3c-4502-add7-9b84963df234 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Received event network-vif-unplugged-0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:52:40 compute-0 nova_compute[187223]: 2025-11-28 17:52:40.108 187227 DEBUG oslo_concurrency.lockutils [req-2f037359-2f37-4ce1-85d3-29d9d0aab5fc req-1a9f4eed-8a3c-4502-add7-9b84963df234 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "3ab64e16-6b3f-4112-957c-e2f871b75da3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:52:40 compute-0 nova_compute[187223]: 2025-11-28 17:52:40.109 187227 DEBUG oslo_concurrency.lockutils [req-2f037359-2f37-4ce1-85d3-29d9d0aab5fc req-1a9f4eed-8a3c-4502-add7-9b84963df234 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "3ab64e16-6b3f-4112-957c-e2f871b75da3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:52:40 compute-0 nova_compute[187223]: 2025-11-28 17:52:40.109 187227 DEBUG oslo_concurrency.lockutils [req-2f037359-2f37-4ce1-85d3-29d9d0aab5fc req-1a9f4eed-8a3c-4502-add7-9b84963df234 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "3ab64e16-6b3f-4112-957c-e2f871b75da3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:52:40 compute-0 nova_compute[187223]: 2025-11-28 17:52:40.110 187227 DEBUG nova.compute.manager [req-2f037359-2f37-4ce1-85d3-29d9d0aab5fc req-1a9f4eed-8a3c-4502-add7-9b84963df234 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] No waiting events found dispatching network-vif-unplugged-0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:52:40 compute-0 nova_compute[187223]: 2025-11-28 17:52:40.110 187227 DEBUG nova.compute.manager [req-2f037359-2f37-4ce1-85d3-29d9d0aab5fc req-1a9f4eed-8a3c-4502-add7-9b84963df234 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Received event network-vif-unplugged-0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 28 17:52:40 compute-0 nova_compute[187223]: 2025-11-28 17:52:40.519 187227 INFO nova.compute.manager [None req-b9bfc8b5-e0dd-4757-8272-14d08c69c47f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Took 4.09 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Nov 28 17:52:40 compute-0 nova_compute[187223]: 2025-11-28 17:52:40.520 187227 DEBUG nova.compute.manager [None req-b9bfc8b5-e0dd-4757-8272-14d08c69c47f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 28 17:52:40 compute-0 nova_compute[187223]: 2025-11-28 17:52:40.540 187227 DEBUG nova.compute.manager [None req-b9bfc8b5-e0dd-4757-8272-14d08c69c47f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp007gfhti',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='3ab64e16-6b3f-4112-957c-e2f871b75da3',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(0bc0b92b-67e3-4d1b-8552-1b33424f0f19),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Nov 28 17:52:40 compute-0 nova_compute[187223]: 2025-11-28 17:52:40.572 187227 DEBUG nova.objects.instance [None req-b9bfc8b5-e0dd-4757-8272-14d08c69c47f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Lazy-loading 'migration_context' on Instance uuid 3ab64e16-6b3f-4112-957c-e2f871b75da3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:52:40 compute-0 nova_compute[187223]: 2025-11-28 17:52:40.575 187227 DEBUG nova.virt.libvirt.driver [None req-b9bfc8b5-e0dd-4757-8272-14d08c69c47f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Nov 28 17:52:40 compute-0 nova_compute[187223]: 2025-11-28 17:52:40.577 187227 DEBUG nova.virt.libvirt.driver [None req-b9bfc8b5-e0dd-4757-8272-14d08c69c47f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Nov 28 17:52:40 compute-0 nova_compute[187223]: 2025-11-28 17:52:40.578 187227 DEBUG nova.virt.libvirt.driver [None req-b9bfc8b5-e0dd-4757-8272-14d08c69c47f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Nov 28 17:52:40 compute-0 nova_compute[187223]: 2025-11-28 17:52:40.663 187227 DEBUG nova.virt.libvirt.vif [None req-b9bfc8b5-e0dd-4757-8272-14d08c69c47f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T17:52:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1043945484',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1043945484',id=21,image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-28T17:52:14Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='f987f40adf1f46018ab0ca81b8d954f6',ramdisk_id='',reservation_id='r-7rxv1g1s',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',
owner_project_name='tempest-TestExecuteStrategies-384316604',owner_user_name='tempest-TestExecuteStrategies-384316604-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-28T17:52:15Z,user_data=None,user_id='40bca16232f3471c8094a414f8874e9a',uuid=3ab64e16-6b3f-4112-957c-e2f871b75da3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da", "address": "fa:16:3e:f7:b8:d5", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap0ed44b8e-9b", "ovs_interfaceid": "0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 28 17:52:40 compute-0 nova_compute[187223]: 2025-11-28 17:52:40.663 187227 DEBUG nova.network.os_vif_util [None req-b9bfc8b5-e0dd-4757-8272-14d08c69c47f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Converting VIF {"id": "0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da", "address": "fa:16:3e:f7:b8:d5", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap0ed44b8e-9b", "ovs_interfaceid": "0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 17:52:40 compute-0 nova_compute[187223]: 2025-11-28 17:52:40.664 187227 DEBUG nova.network.os_vif_util [None req-b9bfc8b5-e0dd-4757-8272-14d08c69c47f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f7:b8:d5,bridge_name='br-int',has_traffic_filtering=True,id=0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da,network=Network(7710a7d0-31b3-4473-89c4-40533fdd6e7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ed44b8e-9b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 17:52:40 compute-0 nova_compute[187223]: 2025-11-28 17:52:40.664 187227 DEBUG nova.virt.libvirt.migration [None req-b9bfc8b5-e0dd-4757-8272-14d08c69c47f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Updating guest XML with vif config: <interface type="ethernet">
Nov 28 17:52:40 compute-0 nova_compute[187223]:   <mac address="fa:16:3e:f7:b8:d5"/>
Nov 28 17:52:40 compute-0 nova_compute[187223]:   <model type="virtio"/>
Nov 28 17:52:40 compute-0 nova_compute[187223]:   <driver name="vhost" rx_queue_size="512"/>
Nov 28 17:52:40 compute-0 nova_compute[187223]:   <mtu size="1442"/>
Nov 28 17:52:40 compute-0 nova_compute[187223]:   <target dev="tap0ed44b8e-9b"/>
Nov 28 17:52:40 compute-0 nova_compute[187223]: </interface>
Nov 28 17:52:40 compute-0 nova_compute[187223]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Nov 28 17:52:40 compute-0 nova_compute[187223]: 2025-11-28 17:52:40.665 187227 DEBUG nova.virt.libvirt.driver [None req-b9bfc8b5-e0dd-4757-8272-14d08c69c47f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Nov 28 17:52:40 compute-0 nova_compute[187223]: 2025-11-28 17:52:40.709 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:52:40 compute-0 nova_compute[187223]: 2025-11-28 17:52:40.710 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:52:40 compute-0 nova_compute[187223]: 2025-11-28 17:52:40.710 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 17:52:41 compute-0 nova_compute[187223]: 2025-11-28 17:52:41.082 187227 DEBUG nova.virt.libvirt.migration [None req-b9bfc8b5-e0dd-4757-8272-14d08c69c47f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 28 17:52:41 compute-0 nova_compute[187223]: 2025-11-28 17:52:41.083 187227 INFO nova.virt.libvirt.migration [None req-b9bfc8b5-e0dd-4757-8272-14d08c69c47f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Increasing downtime to 50 ms after 0 sec elapsed time
Nov 28 17:52:41 compute-0 nova_compute[187223]: 2025-11-28 17:52:41.165 187227 INFO nova.virt.libvirt.driver [None req-b9bfc8b5-e0dd-4757-8272-14d08c69c47f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Nov 28 17:52:41 compute-0 nova_compute[187223]: 2025-11-28 17:52:41.668 187227 DEBUG nova.virt.libvirt.migration [None req-b9bfc8b5-e0dd-4757-8272-14d08c69c47f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 28 17:52:41 compute-0 nova_compute[187223]: 2025-11-28 17:52:41.668 187227 DEBUG nova.virt.libvirt.migration [None req-b9bfc8b5-e0dd-4757-8272-14d08c69c47f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 28 17:52:42 compute-0 nova_compute[187223]: 2025-11-28 17:52:42.173 187227 DEBUG nova.virt.libvirt.migration [None req-b9bfc8b5-e0dd-4757-8272-14d08c69c47f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 28 17:52:42 compute-0 nova_compute[187223]: 2025-11-28 17:52:42.173 187227 DEBUG nova.virt.libvirt.migration [None req-b9bfc8b5-e0dd-4757-8272-14d08c69c47f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 28 17:52:42 compute-0 nova_compute[187223]: 2025-11-28 17:52:42.203 187227 DEBUG nova.compute.manager [req-e9277b34-cd93-44cc-8e71-4fbf8d87c965 req-87de5325-1984-4f48-843e-c6edc5170d9e 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Received event network-vif-plugged-0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:52:42 compute-0 nova_compute[187223]: 2025-11-28 17:52:42.203 187227 DEBUG oslo_concurrency.lockutils [req-e9277b34-cd93-44cc-8e71-4fbf8d87c965 req-87de5325-1984-4f48-843e-c6edc5170d9e 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "3ab64e16-6b3f-4112-957c-e2f871b75da3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:52:42 compute-0 nova_compute[187223]: 2025-11-28 17:52:42.203 187227 DEBUG oslo_concurrency.lockutils [req-e9277b34-cd93-44cc-8e71-4fbf8d87c965 req-87de5325-1984-4f48-843e-c6edc5170d9e 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "3ab64e16-6b3f-4112-957c-e2f871b75da3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:52:42 compute-0 nova_compute[187223]: 2025-11-28 17:52:42.203 187227 DEBUG oslo_concurrency.lockutils [req-e9277b34-cd93-44cc-8e71-4fbf8d87c965 req-87de5325-1984-4f48-843e-c6edc5170d9e 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "3ab64e16-6b3f-4112-957c-e2f871b75da3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:52:42 compute-0 nova_compute[187223]: 2025-11-28 17:52:42.204 187227 DEBUG nova.compute.manager [req-e9277b34-cd93-44cc-8e71-4fbf8d87c965 req-87de5325-1984-4f48-843e-c6edc5170d9e 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] No waiting events found dispatching network-vif-plugged-0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:52:42 compute-0 nova_compute[187223]: 2025-11-28 17:52:42.204 187227 WARNING nova.compute.manager [req-e9277b34-cd93-44cc-8e71-4fbf8d87c965 req-87de5325-1984-4f48-843e-c6edc5170d9e 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Received unexpected event network-vif-plugged-0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da for instance with vm_state active and task_state migrating.
Nov 28 17:52:42 compute-0 nova_compute[187223]: 2025-11-28 17:52:42.204 187227 DEBUG nova.compute.manager [req-e9277b34-cd93-44cc-8e71-4fbf8d87c965 req-87de5325-1984-4f48-843e-c6edc5170d9e 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Received event network-changed-0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:52:42 compute-0 nova_compute[187223]: 2025-11-28 17:52:42.204 187227 DEBUG nova.compute.manager [req-e9277b34-cd93-44cc-8e71-4fbf8d87c965 req-87de5325-1984-4f48-843e-c6edc5170d9e 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Refreshing instance network info cache due to event network-changed-0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 28 17:52:42 compute-0 nova_compute[187223]: 2025-11-28 17:52:42.204 187227 DEBUG oslo_concurrency.lockutils [req-e9277b34-cd93-44cc-8e71-4fbf8d87c965 req-87de5325-1984-4f48-843e-c6edc5170d9e 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "refresh_cache-3ab64e16-6b3f-4112-957c-e2f871b75da3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 17:52:42 compute-0 nova_compute[187223]: 2025-11-28 17:52:42.204 187227 DEBUG oslo_concurrency.lockutils [req-e9277b34-cd93-44cc-8e71-4fbf8d87c965 req-87de5325-1984-4f48-843e-c6edc5170d9e 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquired lock "refresh_cache-3ab64e16-6b3f-4112-957c-e2f871b75da3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 17:52:42 compute-0 nova_compute[187223]: 2025-11-28 17:52:42.205 187227 DEBUG nova.network.neutron [req-e9277b34-cd93-44cc-8e71-4fbf8d87c965 req-87de5325-1984-4f48-843e-c6edc5170d9e 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Refreshing network info cache for port 0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 28 17:52:42 compute-0 nova_compute[187223]: 2025-11-28 17:52:42.677 187227 DEBUG nova.virt.libvirt.migration [None req-b9bfc8b5-e0dd-4757-8272-14d08c69c47f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 28 17:52:42 compute-0 nova_compute[187223]: 2025-11-28 17:52:42.678 187227 DEBUG nova.virt.libvirt.migration [None req-b9bfc8b5-e0dd-4757-8272-14d08c69c47f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 28 17:52:42 compute-0 nova_compute[187223]: 2025-11-28 17:52:42.775 187227 DEBUG nova.virt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Emitting event <LifecycleEvent: 1764352362.775247, 3ab64e16-6b3f-4112-957c-e2f871b75da3 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:52:42 compute-0 nova_compute[187223]: 2025-11-28 17:52:42.776 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] VM Paused (Lifecycle Event)
Nov 28 17:52:42 compute-0 nova_compute[187223]: 2025-11-28 17:52:42.800 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:52:42 compute-0 nova_compute[187223]: 2025-11-28 17:52:42.806 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 28 17:52:42 compute-0 nova_compute[187223]: 2025-11-28 17:52:42.822 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] During sync_power_state the instance has a pending task (migrating). Skip.
Nov 28 17:52:42 compute-0 kernel: tap0ed44b8e-9b (unregistering): left promiscuous mode
Nov 28 17:52:42 compute-0 NetworkManager[55763]: <info>  [1764352362.9482] device (tap0ed44b8e-9b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 28 17:52:42 compute-0 nova_compute[187223]: 2025-11-28 17:52:42.958 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:52:42 compute-0 ovn_controller[95574]: 2025-11-28T17:52:42Z|00164|binding|INFO|Releasing lport 0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da from this chassis (sb_readonly=0)
Nov 28 17:52:42 compute-0 ovn_controller[95574]: 2025-11-28T17:52:42Z|00165|binding|INFO|Setting lport 0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da down in Southbound
Nov 28 17:52:42 compute-0 ovn_controller[95574]: 2025-11-28T17:52:42Z|00166|binding|INFO|Removing iface tap0ed44b8e-9b ovn-installed in OVS
Nov 28 17:52:42 compute-0 nova_compute[187223]: 2025-11-28 17:52:42.962 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:52:42 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:52:42.967 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f7:b8:d5 10.100.0.13'], port_security=['fa:16:3e:f7:b8:d5 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '01f1e5e2-191c-41ea-9a37-abbc72987efb'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '3ab64e16-6b3f-4112-957c-e2f871b75da3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f987f40adf1f46018ab0ca81b8d954f6', 'neutron:revision_number': '8', 'neutron:security_group_ids': '2454196c-3f33-40c4-b65e-ff5ce51cee25', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8b91a17c-2725-4cda-baa8-a6987e8e4bba, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], logical_port=0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 17:52:42 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:52:42.969 104433 INFO neutron.agent.ovn.metadata.agent [-] Port 0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da in datapath 7710a7d0-31b3-4473-89c4-40533fdd6e7d unbound from our chassis
Nov 28 17:52:42 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:52:42.971 104433 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7710a7d0-31b3-4473-89c4-40533fdd6e7d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 17:52:42 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:52:42.974 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[31de9870-8610-4221-828d-8ab4bdb639ee]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:52:42 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:52:42.975 104433 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d namespace which is not needed anymore
Nov 28 17:52:42 compute-0 nova_compute[187223]: 2025-11-28 17:52:42.982 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:52:43 compute-0 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000015.scope: Deactivated successfully.
Nov 28 17:52:43 compute-0 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000015.scope: Consumed 13.667s CPU time.
Nov 28 17:52:43 compute-0 systemd-machined[153517]: Machine qemu-15-instance-00000015 terminated.
Nov 28 17:52:43 compute-0 neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d[216484]: [NOTICE]   (216488) : haproxy version is 2.8.14-c23fe91
Nov 28 17:52:43 compute-0 neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d[216484]: [NOTICE]   (216488) : path to executable is /usr/sbin/haproxy
Nov 28 17:52:43 compute-0 neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d[216484]: [WARNING]  (216488) : Exiting Master process...
Nov 28 17:52:43 compute-0 neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d[216484]: [ALERT]    (216488) : Current worker (216490) exited with code 143 (Terminated)
Nov 28 17:52:43 compute-0 neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d[216484]: [WARNING]  (216488) : All workers exited. Exiting... (0)
Nov 28 17:52:43 compute-0 systemd[1]: libpod-d9b31e4ed7a44b41c65b302005909b224502c645c4f671d0c71073313448d649.scope: Deactivated successfully.
Nov 28 17:52:43 compute-0 nova_compute[187223]: 2025-11-28 17:52:43.158 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:52:43 compute-0 podman[216678]: 2025-11-28 17:52:43.161247278 +0000 UTC m=+0.056394910 container died d9b31e4ed7a44b41c65b302005909b224502c645c4f671d0c71073313448d649 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 28 17:52:43 compute-0 nova_compute[187223]: 2025-11-28 17:52:43.164 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:52:43 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d9b31e4ed7a44b41c65b302005909b224502c645c4f671d0c71073313448d649-userdata-shm.mount: Deactivated successfully.
Nov 28 17:52:43 compute-0 systemd[1]: var-lib-containers-storage-overlay-35fc5d7f06d57642da24c2c953d51804b5aa58e72a68fc5747de5efabd3fffec-merged.mount: Deactivated successfully.
Nov 28 17:52:43 compute-0 nova_compute[187223]: 2025-11-28 17:52:43.209 187227 DEBUG nova.virt.libvirt.guest [None req-b9bfc8b5-e0dd-4757-8272-14d08c69c47f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Nov 28 17:52:43 compute-0 nova_compute[187223]: 2025-11-28 17:52:43.209 187227 INFO nova.virt.libvirt.driver [None req-b9bfc8b5-e0dd-4757-8272-14d08c69c47f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Migration operation has completed
Nov 28 17:52:43 compute-0 nova_compute[187223]: 2025-11-28 17:52:43.210 187227 INFO nova.compute.manager [None req-b9bfc8b5-e0dd-4757-8272-14d08c69c47f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] _post_live_migration() is started..
Nov 28 17:52:43 compute-0 podman[216678]: 2025-11-28 17:52:43.211353778 +0000 UTC m=+0.106501410 container cleanup d9b31e4ed7a44b41c65b302005909b224502c645c4f671d0c71073313448d649 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 17:52:43 compute-0 nova_compute[187223]: 2025-11-28 17:52:43.216 187227 DEBUG nova.virt.libvirt.driver [None req-b9bfc8b5-e0dd-4757-8272-14d08c69c47f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Nov 28 17:52:43 compute-0 nova_compute[187223]: 2025-11-28 17:52:43.216 187227 DEBUG nova.virt.libvirt.driver [None req-b9bfc8b5-e0dd-4757-8272-14d08c69c47f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Nov 28 17:52:43 compute-0 nova_compute[187223]: 2025-11-28 17:52:43.217 187227 DEBUG nova.virt.libvirt.driver [None req-b9bfc8b5-e0dd-4757-8272-14d08c69c47f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Nov 28 17:52:43 compute-0 systemd[1]: libpod-conmon-d9b31e4ed7a44b41c65b302005909b224502c645c4f671d0c71073313448d649.scope: Deactivated successfully.
Nov 28 17:52:43 compute-0 podman[216725]: 2025-11-28 17:52:43.281052176 +0000 UTC m=+0.043226916 container remove d9b31e4ed7a44b41c65b302005909b224502c645c4f671d0c71073313448d649 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS)
Nov 28 17:52:43 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:52:43.289 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[956b3e04-8283-43ad-8732-c0bafbc588f9]: (4, ('Fri Nov 28 05:52:43 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d (d9b31e4ed7a44b41c65b302005909b224502c645c4f671d0c71073313448d649)\nd9b31e4ed7a44b41c65b302005909b224502c645c4f671d0c71073313448d649\nFri Nov 28 05:52:43 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d (d9b31e4ed7a44b41c65b302005909b224502c645c4f671d0c71073313448d649)\nd9b31e4ed7a44b41c65b302005909b224502c645c4f671d0c71073313448d649\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:52:43 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:52:43.292 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[e8f53a62-b773-42f4-b800-9da78360ca74]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:52:43 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:52:43.293 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7710a7d0-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:52:43 compute-0 nova_compute[187223]: 2025-11-28 17:52:43.295 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:52:43 compute-0 kernel: tap7710a7d0-30: left promiscuous mode
Nov 28 17:52:43 compute-0 nova_compute[187223]: 2025-11-28 17:52:43.319 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:52:43 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:52:43.322 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[971b6a25-8c13-46c2-aad4-c8535f0386ea]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:52:43 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:52:43.335 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[ad3cf7d5-0414-4ce7-8f9a-5898151438fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:52:43 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:52:43.336 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[be562617-03de-4ce9-a6a9-7c3a2984380a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:52:43 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:52:43.357 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[13de82f4-b956-4f5e-a1e0-25230ad7059d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 550904, 'reachable_time': 43157, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216743, 'error': None, 'target': 'ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:52:43 compute-0 systemd[1]: run-netns-ovnmeta\x2d7710a7d0\x2d31b3\x2d4473\x2d89c4\x2d40533fdd6e7d.mount: Deactivated successfully.
Nov 28 17:52:43 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:52:43.363 104546 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 28 17:52:43 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:52:43.363 104546 DEBUG oslo.privsep.daemon [-] privsep: reply[e1d06972-5f92-4065-9122-d3d8914300a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:52:43 compute-0 nova_compute[187223]: 2025-11-28 17:52:43.572 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:52:43 compute-0 nova_compute[187223]: 2025-11-28 17:52:43.679 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:52:43 compute-0 nova_compute[187223]: 2025-11-28 17:52:43.717 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:52:44 compute-0 nova_compute[187223]: 2025-11-28 17:52:44.066 187227 DEBUG nova.compute.manager [req-ea5cd72f-e38a-4d64-873f-28c5b1c2d68e req-00db9743-2b20-45c7-a042-11b24fed3617 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Received event network-vif-unplugged-0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:52:44 compute-0 nova_compute[187223]: 2025-11-28 17:52:44.067 187227 DEBUG oslo_concurrency.lockutils [req-ea5cd72f-e38a-4d64-873f-28c5b1c2d68e req-00db9743-2b20-45c7-a042-11b24fed3617 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "3ab64e16-6b3f-4112-957c-e2f871b75da3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:52:44 compute-0 nova_compute[187223]: 2025-11-28 17:52:44.067 187227 DEBUG oslo_concurrency.lockutils [req-ea5cd72f-e38a-4d64-873f-28c5b1c2d68e req-00db9743-2b20-45c7-a042-11b24fed3617 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "3ab64e16-6b3f-4112-957c-e2f871b75da3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:52:44 compute-0 nova_compute[187223]: 2025-11-28 17:52:44.067 187227 DEBUG oslo_concurrency.lockutils [req-ea5cd72f-e38a-4d64-873f-28c5b1c2d68e req-00db9743-2b20-45c7-a042-11b24fed3617 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "3ab64e16-6b3f-4112-957c-e2f871b75da3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:52:44 compute-0 nova_compute[187223]: 2025-11-28 17:52:44.067 187227 DEBUG nova.compute.manager [req-ea5cd72f-e38a-4d64-873f-28c5b1c2d68e req-00db9743-2b20-45c7-a042-11b24fed3617 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] No waiting events found dispatching network-vif-unplugged-0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:52:44 compute-0 nova_compute[187223]: 2025-11-28 17:52:44.068 187227 DEBUG nova.compute.manager [req-ea5cd72f-e38a-4d64-873f-28c5b1c2d68e req-00db9743-2b20-45c7-a042-11b24fed3617 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Received event network-vif-unplugged-0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 28 17:52:44 compute-0 nova_compute[187223]: 2025-11-28 17:52:44.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:52:44 compute-0 nova_compute[187223]: 2025-11-28 17:52:44.696 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:52:45 compute-0 nova_compute[187223]: 2025-11-28 17:52:45.161 187227 DEBUG nova.compute.manager [req-dcf87faa-3deb-4d77-8fda-afd4712406bf req-92de9b2d-637a-45aa-a110-3faca81bc2f8 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Received event network-vif-unplugged-0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:52:45 compute-0 nova_compute[187223]: 2025-11-28 17:52:45.162 187227 DEBUG oslo_concurrency.lockutils [req-dcf87faa-3deb-4d77-8fda-afd4712406bf req-92de9b2d-637a-45aa-a110-3faca81bc2f8 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "3ab64e16-6b3f-4112-957c-e2f871b75da3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:52:45 compute-0 nova_compute[187223]: 2025-11-28 17:52:45.162 187227 DEBUG oslo_concurrency.lockutils [req-dcf87faa-3deb-4d77-8fda-afd4712406bf req-92de9b2d-637a-45aa-a110-3faca81bc2f8 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "3ab64e16-6b3f-4112-957c-e2f871b75da3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:52:45 compute-0 nova_compute[187223]: 2025-11-28 17:52:45.162 187227 DEBUG oslo_concurrency.lockutils [req-dcf87faa-3deb-4d77-8fda-afd4712406bf req-92de9b2d-637a-45aa-a110-3faca81bc2f8 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "3ab64e16-6b3f-4112-957c-e2f871b75da3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:52:45 compute-0 nova_compute[187223]: 2025-11-28 17:52:45.163 187227 DEBUG nova.compute.manager [req-dcf87faa-3deb-4d77-8fda-afd4712406bf req-92de9b2d-637a-45aa-a110-3faca81bc2f8 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] No waiting events found dispatching network-vif-unplugged-0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:52:45 compute-0 nova_compute[187223]: 2025-11-28 17:52:45.163 187227 DEBUG nova.compute.manager [req-dcf87faa-3deb-4d77-8fda-afd4712406bf req-92de9b2d-637a-45aa-a110-3faca81bc2f8 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Received event network-vif-unplugged-0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 28 17:52:45 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:52:45.227 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3e:3c:af', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '0e:e0:60:f9:5f:25'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 17:52:45 compute-0 nova_compute[187223]: 2025-11-28 17:52:45.228 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:52:45 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:52:45.228 104433 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 17:52:45 compute-0 nova_compute[187223]: 2025-11-28 17:52:45.423 187227 DEBUG nova.network.neutron [None req-b9bfc8b5-e0dd-4757-8272-14d08c69c47f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Activated binding for port 0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Nov 28 17:52:45 compute-0 nova_compute[187223]: 2025-11-28 17:52:45.424 187227 DEBUG nova.compute.manager [None req-b9bfc8b5-e0dd-4757-8272-14d08c69c47f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da", "address": "fa:16:3e:f7:b8:d5", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ed44b8e-9b", "ovs_interfaceid": "0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Nov 28 17:52:45 compute-0 nova_compute[187223]: 2025-11-28 17:52:45.426 187227 DEBUG nova.virt.libvirt.vif [None req-b9bfc8b5-e0dd-4757-8272-14d08c69c47f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T17:52:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1043945484',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1043945484',id=21,image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-28T17:52:14Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='f987f40adf1f46018ab0ca81b8d954f6',ramdisk_id='',reservation_id='r-7rxv1g1s',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',
owner_project_name='tempest-TestExecuteStrategies-384316604',owner_user_name='tempest-TestExecuteStrategies-384316604-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-28T17:52:32Z,user_data=None,user_id='40bca16232f3471c8094a414f8874e9a',uuid=3ab64e16-6b3f-4112-957c-e2f871b75da3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da", "address": "fa:16:3e:f7:b8:d5", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ed44b8e-9b", "ovs_interfaceid": "0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 28 17:52:45 compute-0 nova_compute[187223]: 2025-11-28 17:52:45.426 187227 DEBUG nova.network.os_vif_util [None req-b9bfc8b5-e0dd-4757-8272-14d08c69c47f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Converting VIF {"id": "0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da", "address": "fa:16:3e:f7:b8:d5", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ed44b8e-9b", "ovs_interfaceid": "0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 17:52:45 compute-0 nova_compute[187223]: 2025-11-28 17:52:45.428 187227 DEBUG nova.network.os_vif_util [None req-b9bfc8b5-e0dd-4757-8272-14d08c69c47f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f7:b8:d5,bridge_name='br-int',has_traffic_filtering=True,id=0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da,network=Network(7710a7d0-31b3-4473-89c4-40533fdd6e7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ed44b8e-9b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 17:52:45 compute-0 nova_compute[187223]: 2025-11-28 17:52:45.429 187227 DEBUG os_vif [None req-b9bfc8b5-e0dd-4757-8272-14d08c69c47f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f7:b8:d5,bridge_name='br-int',has_traffic_filtering=True,id=0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da,network=Network(7710a7d0-31b3-4473-89c4-40533fdd6e7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ed44b8e-9b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 28 17:52:45 compute-0 nova_compute[187223]: 2025-11-28 17:52:45.434 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:52:45 compute-0 nova_compute[187223]: 2025-11-28 17:52:45.435 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0ed44b8e-9b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:52:45 compute-0 nova_compute[187223]: 2025-11-28 17:52:45.440 187227 DEBUG nova.network.neutron [req-e9277b34-cd93-44cc-8e71-4fbf8d87c965 req-87de5325-1984-4f48-843e-c6edc5170d9e 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Updated VIF entry in instance network info cache for port 0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 28 17:52:45 compute-0 nova_compute[187223]: 2025-11-28 17:52:45.441 187227 DEBUG nova.network.neutron [req-e9277b34-cd93-44cc-8e71-4fbf8d87c965 req-87de5325-1984-4f48-843e-c6edc5170d9e 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Updating instance_info_cache with network_info: [{"id": "0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da", "address": "fa:16:3e:f7:b8:d5", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ed44b8e-9b", "ovs_interfaceid": "0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:52:45 compute-0 nova_compute[187223]: 2025-11-28 17:52:45.443 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:52:45 compute-0 nova_compute[187223]: 2025-11-28 17:52:45.446 187227 INFO os_vif [None req-b9bfc8b5-e0dd-4757-8272-14d08c69c47f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f7:b8:d5,bridge_name='br-int',has_traffic_filtering=True,id=0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da,network=Network(7710a7d0-31b3-4473-89c4-40533fdd6e7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ed44b8e-9b')
Nov 28 17:52:45 compute-0 nova_compute[187223]: 2025-11-28 17:52:45.447 187227 DEBUG oslo_concurrency.lockutils [None req-b9bfc8b5-e0dd-4757-8272-14d08c69c47f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:52:45 compute-0 nova_compute[187223]: 2025-11-28 17:52:45.447 187227 DEBUG oslo_concurrency.lockutils [None req-b9bfc8b5-e0dd-4757-8272-14d08c69c47f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:52:45 compute-0 nova_compute[187223]: 2025-11-28 17:52:45.447 187227 DEBUG oslo_concurrency.lockutils [None req-b9bfc8b5-e0dd-4757-8272-14d08c69c47f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:52:45 compute-0 nova_compute[187223]: 2025-11-28 17:52:45.447 187227 DEBUG nova.compute.manager [None req-b9bfc8b5-e0dd-4757-8272-14d08c69c47f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Nov 28 17:52:45 compute-0 nova_compute[187223]: 2025-11-28 17:52:45.448 187227 INFO nova.virt.libvirt.driver [None req-b9bfc8b5-e0dd-4757-8272-14d08c69c47f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Deleting instance files /var/lib/nova/instances/3ab64e16-6b3f-4112-957c-e2f871b75da3_del
Nov 28 17:52:45 compute-0 nova_compute[187223]: 2025-11-28 17:52:45.449 187227 INFO nova.virt.libvirt.driver [None req-b9bfc8b5-e0dd-4757-8272-14d08c69c47f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Deletion of /var/lib/nova/instances/3ab64e16-6b3f-4112-957c-e2f871b75da3_del complete
Nov 28 17:52:45 compute-0 nova_compute[187223]: 2025-11-28 17:52:45.495 187227 DEBUG oslo_concurrency.lockutils [req-e9277b34-cd93-44cc-8e71-4fbf8d87c965 req-87de5325-1984-4f48-843e-c6edc5170d9e 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Releasing lock "refresh_cache-3ab64e16-6b3f-4112-957c-e2f871b75da3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 17:52:46 compute-0 nova_compute[187223]: 2025-11-28 17:52:46.185 187227 DEBUG nova.compute.manager [req-db514484-27f4-4ff0-b363-f0b440b0640d req-d8a45a1b-e021-4983-adb7-fafc5b5db3b8 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Received event network-vif-plugged-0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:52:46 compute-0 nova_compute[187223]: 2025-11-28 17:52:46.186 187227 DEBUG oslo_concurrency.lockutils [req-db514484-27f4-4ff0-b363-f0b440b0640d req-d8a45a1b-e021-4983-adb7-fafc5b5db3b8 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "3ab64e16-6b3f-4112-957c-e2f871b75da3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:52:46 compute-0 nova_compute[187223]: 2025-11-28 17:52:46.186 187227 DEBUG oslo_concurrency.lockutils [req-db514484-27f4-4ff0-b363-f0b440b0640d req-d8a45a1b-e021-4983-adb7-fafc5b5db3b8 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "3ab64e16-6b3f-4112-957c-e2f871b75da3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:52:46 compute-0 nova_compute[187223]: 2025-11-28 17:52:46.187 187227 DEBUG oslo_concurrency.lockutils [req-db514484-27f4-4ff0-b363-f0b440b0640d req-d8a45a1b-e021-4983-adb7-fafc5b5db3b8 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "3ab64e16-6b3f-4112-957c-e2f871b75da3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:52:46 compute-0 nova_compute[187223]: 2025-11-28 17:52:46.187 187227 DEBUG nova.compute.manager [req-db514484-27f4-4ff0-b363-f0b440b0640d req-d8a45a1b-e021-4983-adb7-fafc5b5db3b8 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] No waiting events found dispatching network-vif-plugged-0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:52:46 compute-0 nova_compute[187223]: 2025-11-28 17:52:46.188 187227 WARNING nova.compute.manager [req-db514484-27f4-4ff0-b363-f0b440b0640d req-d8a45a1b-e021-4983-adb7-fafc5b5db3b8 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Received unexpected event network-vif-plugged-0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da for instance with vm_state active and task_state migrating.
Nov 28 17:52:46 compute-0 nova_compute[187223]: 2025-11-28 17:52:46.189 187227 DEBUG nova.compute.manager [req-db514484-27f4-4ff0-b363-f0b440b0640d req-d8a45a1b-e021-4983-adb7-fafc5b5db3b8 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Received event network-vif-plugged-0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:52:46 compute-0 nova_compute[187223]: 2025-11-28 17:52:46.189 187227 DEBUG oslo_concurrency.lockutils [req-db514484-27f4-4ff0-b363-f0b440b0640d req-d8a45a1b-e021-4983-adb7-fafc5b5db3b8 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "3ab64e16-6b3f-4112-957c-e2f871b75da3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:52:46 compute-0 nova_compute[187223]: 2025-11-28 17:52:46.190 187227 DEBUG oslo_concurrency.lockutils [req-db514484-27f4-4ff0-b363-f0b440b0640d req-d8a45a1b-e021-4983-adb7-fafc5b5db3b8 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "3ab64e16-6b3f-4112-957c-e2f871b75da3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:52:46 compute-0 nova_compute[187223]: 2025-11-28 17:52:46.190 187227 DEBUG oslo_concurrency.lockutils [req-db514484-27f4-4ff0-b363-f0b440b0640d req-d8a45a1b-e021-4983-adb7-fafc5b5db3b8 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "3ab64e16-6b3f-4112-957c-e2f871b75da3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:52:46 compute-0 nova_compute[187223]: 2025-11-28 17:52:46.191 187227 DEBUG nova.compute.manager [req-db514484-27f4-4ff0-b363-f0b440b0640d req-d8a45a1b-e021-4983-adb7-fafc5b5db3b8 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] No waiting events found dispatching network-vif-plugged-0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:52:46 compute-0 nova_compute[187223]: 2025-11-28 17:52:46.191 187227 WARNING nova.compute.manager [req-db514484-27f4-4ff0-b363-f0b440b0640d req-d8a45a1b-e021-4983-adb7-fafc5b5db3b8 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Received unexpected event network-vif-plugged-0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da for instance with vm_state active and task_state migrating.
Nov 28 17:52:46 compute-0 nova_compute[187223]: 2025-11-28 17:52:46.192 187227 DEBUG nova.compute.manager [req-db514484-27f4-4ff0-b363-f0b440b0640d req-d8a45a1b-e021-4983-adb7-fafc5b5db3b8 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Received event network-vif-plugged-0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:52:46 compute-0 nova_compute[187223]: 2025-11-28 17:52:46.192 187227 DEBUG oslo_concurrency.lockutils [req-db514484-27f4-4ff0-b363-f0b440b0640d req-d8a45a1b-e021-4983-adb7-fafc5b5db3b8 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "3ab64e16-6b3f-4112-957c-e2f871b75da3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:52:46 compute-0 nova_compute[187223]: 2025-11-28 17:52:46.193 187227 DEBUG oslo_concurrency.lockutils [req-db514484-27f4-4ff0-b363-f0b440b0640d req-d8a45a1b-e021-4983-adb7-fafc5b5db3b8 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "3ab64e16-6b3f-4112-957c-e2f871b75da3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:52:46 compute-0 nova_compute[187223]: 2025-11-28 17:52:46.193 187227 DEBUG oslo_concurrency.lockutils [req-db514484-27f4-4ff0-b363-f0b440b0640d req-d8a45a1b-e021-4983-adb7-fafc5b5db3b8 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "3ab64e16-6b3f-4112-957c-e2f871b75da3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:52:46 compute-0 nova_compute[187223]: 2025-11-28 17:52:46.194 187227 DEBUG nova.compute.manager [req-db514484-27f4-4ff0-b363-f0b440b0640d req-d8a45a1b-e021-4983-adb7-fafc5b5db3b8 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] No waiting events found dispatching network-vif-plugged-0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:52:46 compute-0 nova_compute[187223]: 2025-11-28 17:52:46.194 187227 WARNING nova.compute.manager [req-db514484-27f4-4ff0-b363-f0b440b0640d req-d8a45a1b-e021-4983-adb7-fafc5b5db3b8 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Received unexpected event network-vif-plugged-0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da for instance with vm_state active and task_state migrating.
Nov 28 17:52:46 compute-0 nova_compute[187223]: 2025-11-28 17:52:46.195 187227 DEBUG nova.compute.manager [req-db514484-27f4-4ff0-b363-f0b440b0640d req-d8a45a1b-e021-4983-adb7-fafc5b5db3b8 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Received event network-vif-plugged-0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:52:46 compute-0 nova_compute[187223]: 2025-11-28 17:52:46.195 187227 DEBUG oslo_concurrency.lockutils [req-db514484-27f4-4ff0-b363-f0b440b0640d req-d8a45a1b-e021-4983-adb7-fafc5b5db3b8 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "3ab64e16-6b3f-4112-957c-e2f871b75da3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:52:46 compute-0 nova_compute[187223]: 2025-11-28 17:52:46.196 187227 DEBUG oslo_concurrency.lockutils [req-db514484-27f4-4ff0-b363-f0b440b0640d req-d8a45a1b-e021-4983-adb7-fafc5b5db3b8 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "3ab64e16-6b3f-4112-957c-e2f871b75da3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:52:46 compute-0 nova_compute[187223]: 2025-11-28 17:52:46.196 187227 DEBUG oslo_concurrency.lockutils [req-db514484-27f4-4ff0-b363-f0b440b0640d req-d8a45a1b-e021-4983-adb7-fafc5b5db3b8 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "3ab64e16-6b3f-4112-957c-e2f871b75da3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:52:46 compute-0 nova_compute[187223]: 2025-11-28 17:52:46.197 187227 DEBUG nova.compute.manager [req-db514484-27f4-4ff0-b363-f0b440b0640d req-d8a45a1b-e021-4983-adb7-fafc5b5db3b8 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] No waiting events found dispatching network-vif-plugged-0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:52:46 compute-0 nova_compute[187223]: 2025-11-28 17:52:46.197 187227 WARNING nova.compute.manager [req-db514484-27f4-4ff0-b363-f0b440b0640d req-d8a45a1b-e021-4983-adb7-fafc5b5db3b8 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Received unexpected event network-vif-plugged-0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da for instance with vm_state active and task_state migrating.
Nov 28 17:52:47 compute-0 nova_compute[187223]: 2025-11-28 17:52:47.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:52:47 compute-0 nova_compute[187223]: 2025-11-28 17:52:47.684 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 17:52:47 compute-0 nova_compute[187223]: 2025-11-28 17:52:47.684 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 17:52:47 compute-0 nova_compute[187223]: 2025-11-28 17:52:47.703 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "refresh_cache-3ab64e16-6b3f-4112-957c-e2f871b75da3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 17:52:47 compute-0 nova_compute[187223]: 2025-11-28 17:52:47.703 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquired lock "refresh_cache-3ab64e16-6b3f-4112-957c-e2f871b75da3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 17:52:47 compute-0 nova_compute[187223]: 2025-11-28 17:52:47.704 187227 DEBUG nova.network.neutron [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 28 17:52:47 compute-0 nova_compute[187223]: 2025-11-28 17:52:47.704 187227 DEBUG nova.objects.instance [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 3ab64e16-6b3f-4112-957c-e2f871b75da3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:52:48 compute-0 podman[216745]: 2025-11-28 17:52:48.187559719 +0000 UTC m=+0.048950284 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 17:52:48 compute-0 nova_compute[187223]: 2025-11-28 17:52:48.898 187227 DEBUG nova.network.neutron [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Updating instance_info_cache with network_info: [{"id": "0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da", "address": "fa:16:3e:f7:b8:d5", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ed44b8e-9b", "ovs_interfaceid": "0ed44b8e-9bee-4529-9b27-1ac3c6c2b3da", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:52:48 compute-0 nova_compute[187223]: 2025-11-28 17:52:48.912 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Releasing lock "refresh_cache-3ab64e16-6b3f-4112-957c-e2f871b75da3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 17:52:48 compute-0 nova_compute[187223]: 2025-11-28 17:52:48.913 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 28 17:52:48 compute-0 nova_compute[187223]: 2025-11-28 17:52:48.913 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:52:48 compute-0 nova_compute[187223]: 2025-11-28 17:52:48.982 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:52:48 compute-0 nova_compute[187223]: 2025-11-28 17:52:48.983 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:52:48 compute-0 nova_compute[187223]: 2025-11-28 17:52:48.983 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:52:48 compute-0 nova_compute[187223]: 2025-11-28 17:52:48.984 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 17:52:49 compute-0 nova_compute[187223]: 2025-11-28 17:52:49.154 187227 WARNING nova.virt.libvirt.driver [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 17:52:49 compute-0 nova_compute[187223]: 2025-11-28 17:52:49.156 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5860MB free_disk=73.34151077270508GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 17:52:49 compute-0 nova_compute[187223]: 2025-11-28 17:52:49.156 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:52:49 compute-0 nova_compute[187223]: 2025-11-28 17:52:49.156 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:52:49 compute-0 nova_compute[187223]: 2025-11-28 17:52:49.194 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Migration for instance 3ab64e16-6b3f-4112-957c-e2f871b75da3 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Nov 28 17:52:49 compute-0 nova_compute[187223]: 2025-11-28 17:52:49.215 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Nov 28 17:52:49 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Nov 28 17:52:49 compute-0 systemd[216633]: Activating special unit Exit the Session...
Nov 28 17:52:49 compute-0 systemd[216633]: Stopped target Main User Target.
Nov 28 17:52:49 compute-0 systemd[216633]: Stopped target Basic System.
Nov 28 17:52:49 compute-0 systemd[216633]: Stopped target Paths.
Nov 28 17:52:49 compute-0 systemd[216633]: Stopped target Sockets.
Nov 28 17:52:49 compute-0 systemd[216633]: Stopped target Timers.
Nov 28 17:52:49 compute-0 systemd[216633]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 28 17:52:49 compute-0 systemd[216633]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 28 17:52:49 compute-0 systemd[216633]: Closed D-Bus User Message Bus Socket.
Nov 28 17:52:49 compute-0 systemd[216633]: Stopped Create User's Volatile Files and Directories.
Nov 28 17:52:49 compute-0 systemd[216633]: Removed slice User Application Slice.
Nov 28 17:52:49 compute-0 systemd[216633]: Reached target Shutdown.
Nov 28 17:52:49 compute-0 systemd[216633]: Finished Exit the Session.
Nov 28 17:52:49 compute-0 systemd[216633]: Reached target Exit the Session.
Nov 28 17:52:49 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Nov 28 17:52:49 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Nov 28 17:52:49 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 28 17:52:49 compute-0 nova_compute[187223]: 2025-11-28 17:52:49.401 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Migration 0bc0b92b-67e3-4d1b-8552-1b33424f0f19 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Nov 28 17:52:49 compute-0 nova_compute[187223]: 2025-11-28 17:52:49.402 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 17:52:49 compute-0 nova_compute[187223]: 2025-11-28 17:52:49.402 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 17:52:49 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 28 17:52:49 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 28 17:52:49 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 28 17:52:49 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Nov 28 17:52:49 compute-0 nova_compute[187223]: 2025-11-28 17:52:49.488 187227 DEBUG nova.compute.provider_tree [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 17:52:49 compute-0 nova_compute[187223]: 2025-11-28 17:52:49.508 187227 DEBUG nova.scheduler.client.report [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 17:52:49 compute-0 nova_compute[187223]: 2025-11-28 17:52:49.528 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 17:52:49 compute-0 nova_compute[187223]: 2025-11-28 17:52:49.528 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.372s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:52:49 compute-0 nova_compute[187223]: 2025-11-28 17:52:49.698 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:52:49 compute-0 nova_compute[187223]: 2025-11-28 17:52:49.760 187227 DEBUG oslo_concurrency.lockutils [None req-b9bfc8b5-e0dd-4757-8272-14d08c69c47f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Acquiring lock "3ab64e16-6b3f-4112-957c-e2f871b75da3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:52:49 compute-0 nova_compute[187223]: 2025-11-28 17:52:49.760 187227 DEBUG oslo_concurrency.lockutils [None req-b9bfc8b5-e0dd-4757-8272-14d08c69c47f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Lock "3ab64e16-6b3f-4112-957c-e2f871b75da3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:52:49 compute-0 nova_compute[187223]: 2025-11-28 17:52:49.760 187227 DEBUG oslo_concurrency.lockutils [None req-b9bfc8b5-e0dd-4757-8272-14d08c69c47f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Lock "3ab64e16-6b3f-4112-957c-e2f871b75da3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:52:49 compute-0 nova_compute[187223]: 2025-11-28 17:52:49.786 187227 DEBUG oslo_concurrency.lockutils [None req-b9bfc8b5-e0dd-4757-8272-14d08c69c47f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:52:49 compute-0 nova_compute[187223]: 2025-11-28 17:52:49.786 187227 DEBUG oslo_concurrency.lockutils [None req-b9bfc8b5-e0dd-4757-8272-14d08c69c47f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:52:49 compute-0 nova_compute[187223]: 2025-11-28 17:52:49.786 187227 DEBUG oslo_concurrency.lockutils [None req-b9bfc8b5-e0dd-4757-8272-14d08c69c47f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:52:49 compute-0 nova_compute[187223]: 2025-11-28 17:52:49.786 187227 DEBUG nova.compute.resource_tracker [None req-b9bfc8b5-e0dd-4757-8272-14d08c69c47f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 17:52:49 compute-0 nova_compute[187223]: 2025-11-28 17:52:49.930 187227 WARNING nova.virt.libvirt.driver [None req-b9bfc8b5-e0dd-4757-8272-14d08c69c47f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 17:52:49 compute-0 nova_compute[187223]: 2025-11-28 17:52:49.931 187227 DEBUG nova.compute.resource_tracker [None req-b9bfc8b5-e0dd-4757-8272-14d08c69c47f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5866MB free_disk=73.34149169921875GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 17:52:49 compute-0 nova_compute[187223]: 2025-11-28 17:52:49.932 187227 DEBUG oslo_concurrency.lockutils [None req-b9bfc8b5-e0dd-4757-8272-14d08c69c47f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:52:49 compute-0 nova_compute[187223]: 2025-11-28 17:52:49.932 187227 DEBUG oslo_concurrency.lockutils [None req-b9bfc8b5-e0dd-4757-8272-14d08c69c47f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:52:49 compute-0 nova_compute[187223]: 2025-11-28 17:52:49.968 187227 DEBUG nova.compute.resource_tracker [None req-b9bfc8b5-e0dd-4757-8272-14d08c69c47f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Migration for instance 3ab64e16-6b3f-4112-957c-e2f871b75da3 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Nov 28 17:52:49 compute-0 nova_compute[187223]: 2025-11-28 17:52:49.987 187227 DEBUG nova.compute.resource_tracker [None req-b9bfc8b5-e0dd-4757-8272-14d08c69c47f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Nov 28 17:52:50 compute-0 nova_compute[187223]: 2025-11-28 17:52:50.015 187227 DEBUG nova.compute.resource_tracker [None req-b9bfc8b5-e0dd-4757-8272-14d08c69c47f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Migration 0bc0b92b-67e3-4d1b-8552-1b33424f0f19 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Nov 28 17:52:50 compute-0 nova_compute[187223]: 2025-11-28 17:52:50.016 187227 DEBUG nova.compute.resource_tracker [None req-b9bfc8b5-e0dd-4757-8272-14d08c69c47f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 17:52:50 compute-0 nova_compute[187223]: 2025-11-28 17:52:50.016 187227 DEBUG nova.compute.resource_tracker [None req-b9bfc8b5-e0dd-4757-8272-14d08c69c47f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 17:52:50 compute-0 nova_compute[187223]: 2025-11-28 17:52:50.067 187227 DEBUG nova.compute.provider_tree [None req-b9bfc8b5-e0dd-4757-8272-14d08c69c47f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 17:52:50 compute-0 nova_compute[187223]: 2025-11-28 17:52:50.090 187227 DEBUG nova.scheduler.client.report [None req-b9bfc8b5-e0dd-4757-8272-14d08c69c47f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 17:52:50 compute-0 nova_compute[187223]: 2025-11-28 17:52:50.092 187227 DEBUG nova.compute.resource_tracker [None req-b9bfc8b5-e0dd-4757-8272-14d08c69c47f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 17:52:50 compute-0 nova_compute[187223]: 2025-11-28 17:52:50.092 187227 DEBUG oslo_concurrency.lockutils [None req-b9bfc8b5-e0dd-4757-8272-14d08c69c47f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.160s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:52:50 compute-0 nova_compute[187223]: 2025-11-28 17:52:50.098 187227 INFO nova.compute.manager [None req-b9bfc8b5-e0dd-4757-8272-14d08c69c47f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Nov 28 17:52:50 compute-0 nova_compute[187223]: 2025-11-28 17:52:50.207 187227 INFO nova.scheduler.client.report [None req-b9bfc8b5-e0dd-4757-8272-14d08c69c47f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Deleted allocation for migration 0bc0b92b-67e3-4d1b-8552-1b33424f0f19
Nov 28 17:52:50 compute-0 nova_compute[187223]: 2025-11-28 17:52:50.208 187227 DEBUG nova.virt.libvirt.driver [None req-b9bfc8b5-e0dd-4757-8272-14d08c69c47f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Nov 28 17:52:50 compute-0 nova_compute[187223]: 2025-11-28 17:52:50.439 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:52:50 compute-0 nova_compute[187223]: 2025-11-28 17:52:50.498 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:52:50 compute-0 nova_compute[187223]: 2025-11-28 17:52:50.681 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:52:51 compute-0 nova_compute[187223]: 2025-11-28 17:52:51.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:52:52 compute-0 ovn_controller[95574]: 2025-11-28T17:52:52Z|00167|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Nov 28 17:52:53 compute-0 nova_compute[187223]: 2025-11-28 17:52:53.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:52:54 compute-0 nova_compute[187223]: 2025-11-28 17:52:54.700 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:52:54 compute-0 nova_compute[187223]: 2025-11-28 17:52:54.730 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:52:55 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:52:55.231 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2ad2dbac-a967-40fb-b69b-7c374c5f8e9d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:52:55 compute-0 nova_compute[187223]: 2025-11-28 17:52:55.440 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:52:56 compute-0 podman[216773]: 2025-11-28 17:52:56.215664069 +0000 UTC m=+0.068329116 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 28 17:52:58 compute-0 nova_compute[187223]: 2025-11-28 17:52:58.208 187227 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764352363.2079413, 3ab64e16-6b3f-4112-957c-e2f871b75da3 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:52:58 compute-0 nova_compute[187223]: 2025-11-28 17:52:58.209 187227 INFO nova.compute.manager [-] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] VM Stopped (Lifecycle Event)
Nov 28 17:52:58 compute-0 nova_compute[187223]: 2025-11-28 17:52:58.236 187227 DEBUG nova.compute.manager [None req-60e03115-b650-4d65-82a3-66fd5b92ef50 - - - - - -] [instance: 3ab64e16-6b3f-4112-957c-e2f871b75da3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:52:59 compute-0 nova_compute[187223]: 2025-11-28 17:52:59.734 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:52:59 compute-0 podman[197556]: time="2025-11-28T17:52:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:52:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:52:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 28 17:52:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:52:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2601 "" "Go-http-client/1.1"
Nov 28 17:53:00 compute-0 nova_compute[187223]: 2025-11-28 17:53:00.443 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:53:01 compute-0 openstack_network_exporter[199717]: ERROR   17:53:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:53:01 compute-0 openstack_network_exporter[199717]: ERROR   17:53:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:53:01 compute-0 openstack_network_exporter[199717]: ERROR   17:53:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:53:01 compute-0 openstack_network_exporter[199717]: ERROR   17:53:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:53:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:53:01 compute-0 openstack_network_exporter[199717]: ERROR   17:53:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:53:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:53:02 compute-0 nova_compute[187223]: 2025-11-28 17:53:02.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:53:02 compute-0 nova_compute[187223]: 2025-11-28 17:53:02.684 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 28 17:53:03 compute-0 podman[216792]: 2025-11-28 17:53:03.215824985 +0000 UTC m=+0.063172953 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 28 17:53:03 compute-0 podman[216793]: 2025-11-28 17:53:03.238134799 +0000 UTC m=+0.079191237 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 17:53:04 compute-0 nova_compute[187223]: 2025-11-28 17:53:04.734 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:53:05 compute-0 nova_compute[187223]: 2025-11-28 17:53:05.883 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:53:06 compute-0 podman[216836]: 2025-11-28 17:53:06.207471323 +0000 UTC m=+0.068207143 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, config_id=edpm, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., name=ubi9-minimal, io.buildah.version=1.33.7, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, version=9.6, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9)
Nov 28 17:53:09 compute-0 nova_compute[187223]: 2025-11-28 17:53:09.777 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:53:10 compute-0 nova_compute[187223]: 2025-11-28 17:53:10.886 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:53:14 compute-0 nova_compute[187223]: 2025-11-28 17:53:14.904 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:53:15 compute-0 nova_compute[187223]: 2025-11-28 17:53:15.888 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:53:16 compute-0 nova_compute[187223]: 2025-11-28 17:53:16.521 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:53:17 compute-0 sshd-session[216858]: Invalid user sol from 193.32.162.146 port 34034
Nov 28 17:53:17 compute-0 sshd-session[216858]: Connection closed by invalid user sol 193.32.162.146 port 34034 [preauth]
Nov 28 17:53:19 compute-0 podman[216860]: 2025-11-28 17:53:19.20661116 +0000 UTC m=+0.064681009 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 17:53:19 compute-0 nova_compute[187223]: 2025-11-28 17:53:19.906 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:53:20 compute-0 nova_compute[187223]: 2025-11-28 17:53:20.891 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:53:24 compute-0 nova_compute[187223]: 2025-11-28 17:53:24.907 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:53:25 compute-0 nova_compute[187223]: 2025-11-28 17:53:25.893 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:53:27 compute-0 podman[216886]: 2025-11-28 17:53:27.224850795 +0000 UTC m=+0.074999876 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125)
Nov 28 17:53:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:53:27.706 104433 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:53:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:53:27.706 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:53:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:53:27.706 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:53:29 compute-0 podman[197556]: time="2025-11-28T17:53:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:53:29 compute-0 podman[197556]: @ - - [28/Nov/2025:17:53:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 28 17:53:29 compute-0 podman[197556]: @ - - [28/Nov/2025:17:53:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2597 "" "Go-http-client/1.1"
Nov 28 17:53:29 compute-0 nova_compute[187223]: 2025-11-28 17:53:29.910 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:53:30 compute-0 nova_compute[187223]: 2025-11-28 17:53:30.895 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:53:31 compute-0 openstack_network_exporter[199717]: ERROR   17:53:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:53:31 compute-0 openstack_network_exporter[199717]: ERROR   17:53:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:53:31 compute-0 openstack_network_exporter[199717]: ERROR   17:53:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:53:31 compute-0 openstack_network_exporter[199717]: ERROR   17:53:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:53:31 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:53:31 compute-0 openstack_network_exporter[199717]: ERROR   17:53:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:53:31 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:53:34 compute-0 podman[216905]: 2025-11-28 17:53:34.260624113 +0000 UTC m=+0.111217443 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=multipathd, org.label-schema.schema-version=1.0)
Nov 28 17:53:34 compute-0 podman[216906]: 2025-11-28 17:53:34.283127171 +0000 UTC m=+0.128482516 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Nov 28 17:53:34 compute-0 nova_compute[187223]: 2025-11-28 17:53:34.912 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:53:35 compute-0 nova_compute[187223]: 2025-11-28 17:53:35.897 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:53:37 compute-0 podman[216953]: 2025-11-28 17:53:37.205569083 +0000 UTC m=+0.062585739 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, managed_by=edpm_ansible, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-type=git)
Nov 28 17:53:39 compute-0 nova_compute[187223]: 2025-11-28 17:53:39.914 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:53:40 compute-0 nova_compute[187223]: 2025-11-28 17:53:40.900 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:53:41 compute-0 nova_compute[187223]: 2025-11-28 17:53:41.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:53:41 compute-0 nova_compute[187223]: 2025-11-28 17:53:41.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:53:41 compute-0 nova_compute[187223]: 2025-11-28 17:53:41.684 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 17:53:43 compute-0 ovn_controller[95574]: 2025-11-28T17:53:43Z|00168|memory_trim|INFO|Detected inactivity (last active 30010 ms ago): trimming memory
Nov 28 17:53:44 compute-0 nova_compute[187223]: 2025-11-28 17:53:44.916 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:53:45 compute-0 nova_compute[187223]: 2025-11-28 17:53:45.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:53:45 compute-0 nova_compute[187223]: 2025-11-28 17:53:45.685 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:53:45 compute-0 nova_compute[187223]: 2025-11-28 17:53:45.902 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:53:48 compute-0 nova_compute[187223]: 2025-11-28 17:53:48.685 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:53:48 compute-0 nova_compute[187223]: 2025-11-28 17:53:48.685 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 17:53:48 compute-0 nova_compute[187223]: 2025-11-28 17:53:48.686 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 17:53:48 compute-0 nova_compute[187223]: 2025-11-28 17:53:48.712 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 17:53:49 compute-0 nova_compute[187223]: 2025-11-28 17:53:49.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:53:49 compute-0 nova_compute[187223]: 2025-11-28 17:53:49.709 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:53:49 compute-0 nova_compute[187223]: 2025-11-28 17:53:49.709 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:53:49 compute-0 nova_compute[187223]: 2025-11-28 17:53:49.710 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:53:49 compute-0 nova_compute[187223]: 2025-11-28 17:53:49.710 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 17:53:49 compute-0 nova_compute[187223]: 2025-11-28 17:53:49.919 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:53:49 compute-0 nova_compute[187223]: 2025-11-28 17:53:49.945 187227 WARNING nova.virt.libvirt.driver [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 17:53:49 compute-0 nova_compute[187223]: 2025-11-28 17:53:49.946 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5892MB free_disk=73.34151077270508GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 17:53:49 compute-0 nova_compute[187223]: 2025-11-28 17:53:49.947 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:53:49 compute-0 nova_compute[187223]: 2025-11-28 17:53:49.947 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:53:50 compute-0 nova_compute[187223]: 2025-11-28 17:53:50.023 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 17:53:50 compute-0 nova_compute[187223]: 2025-11-28 17:53:50.024 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 17:53:50 compute-0 nova_compute[187223]: 2025-11-28 17:53:50.151 187227 DEBUG nova.compute.provider_tree [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 17:53:50 compute-0 nova_compute[187223]: 2025-11-28 17:53:50.173 187227 DEBUG nova.scheduler.client.report [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 17:53:50 compute-0 nova_compute[187223]: 2025-11-28 17:53:50.175 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 17:53:50 compute-0 nova_compute[187223]: 2025-11-28 17:53:50.175 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.228s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:53:50 compute-0 podman[216975]: 2025-11-28 17:53:50.230116027 +0000 UTC m=+0.080499657 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 17:53:50 compute-0 nova_compute[187223]: 2025-11-28 17:53:50.905 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:53:52 compute-0 nova_compute[187223]: 2025-11-28 17:53:52.176 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:53:52 compute-0 nova_compute[187223]: 2025-11-28 17:53:52.678 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:53:54 compute-0 nova_compute[187223]: 2025-11-28 17:53:54.682 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:53:54 compute-0 nova_compute[187223]: 2025-11-28 17:53:54.921 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:53:55 compute-0 nova_compute[187223]: 2025-11-28 17:53:55.908 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:53:58 compute-0 podman[216999]: 2025-11-28 17:53:58.208488108 +0000 UTC m=+0.064861643 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 28 17:53:59 compute-0 podman[197556]: time="2025-11-28T17:53:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:53:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:53:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 28 17:53:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:53:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2594 "" "Go-http-client/1.1"
Nov 28 17:53:59 compute-0 nova_compute[187223]: 2025-11-28 17:53:59.924 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:54:00 compute-0 nova_compute[187223]: 2025-11-28 17:54:00.910 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:54:01 compute-0 openstack_network_exporter[199717]: ERROR   17:54:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:54:01 compute-0 openstack_network_exporter[199717]: ERROR   17:54:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:54:01 compute-0 openstack_network_exporter[199717]: ERROR   17:54:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:54:01 compute-0 openstack_network_exporter[199717]: ERROR   17:54:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:54:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:54:01 compute-0 openstack_network_exporter[199717]: ERROR   17:54:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:54:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:54:04 compute-0 nova_compute[187223]: 2025-11-28 17:54:04.926 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:54:05 compute-0 podman[217018]: 2025-11-28 17:54:05.204580747 +0000 UTC m=+0.059552037 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Nov 28 17:54:05 compute-0 podman[217019]: 2025-11-28 17:54:05.248426406 +0000 UTC m=+0.091344797 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 17:54:05 compute-0 nova_compute[187223]: 2025-11-28 17:54:05.913 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:54:08 compute-0 nova_compute[187223]: 2025-11-28 17:54:08.156 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:54:08 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:54:08.156 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=24, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3e:3c:af', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '0e:e0:60:f9:5f:25'}, ipsec=False) old=SB_Global(nb_cfg=23) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 17:54:08 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:54:08.158 104433 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 17:54:08 compute-0 podman[217061]: 2025-11-28 17:54:08.206959881 +0000 UTC m=+0.070283283 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, distribution-scope=public, maintainer=Red Hat, Inc., name=ubi9-minimal, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, architecture=x86_64, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 17:54:09 compute-0 nova_compute[187223]: 2025-11-28 17:54:09.928 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:54:10 compute-0 nova_compute[187223]: 2025-11-28 17:54:10.915 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:54:12 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:54:12.160 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2ad2dbac-a967-40fb-b69b-7c374c5f8e9d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '24'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:54:14 compute-0 nova_compute[187223]: 2025-11-28 17:54:14.929 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:54:15 compute-0 nova_compute[187223]: 2025-11-28 17:54:15.918 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:54:19 compute-0 nova_compute[187223]: 2025-11-28 17:54:19.932 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:54:20 compute-0 nova_compute[187223]: 2025-11-28 17:54:20.924 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:54:21 compute-0 podman[217081]: 2025-11-28 17:54:21.19858391 +0000 UTC m=+0.054113257 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 17:54:23 compute-0 nova_compute[187223]: 2025-11-28 17:54:23.419 187227 DEBUG oslo_concurrency.lockutils [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Acquiring lock "483838f6-db84-40e5-a623-242e763d4feb" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:54:23 compute-0 nova_compute[187223]: 2025-11-28 17:54:23.419 187227 DEBUG oslo_concurrency.lockutils [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "483838f6-db84-40e5-a623-242e763d4feb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:54:23 compute-0 nova_compute[187223]: 2025-11-28 17:54:23.447 187227 DEBUG nova.compute.manager [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 483838f6-db84-40e5-a623-242e763d4feb] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 28 17:54:23 compute-0 nova_compute[187223]: 2025-11-28 17:54:23.546 187227 DEBUG oslo_concurrency.lockutils [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:54:23 compute-0 nova_compute[187223]: 2025-11-28 17:54:23.547 187227 DEBUG oslo_concurrency.lockutils [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:54:23 compute-0 nova_compute[187223]: 2025-11-28 17:54:23.560 187227 DEBUG nova.virt.hardware [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 28 17:54:23 compute-0 nova_compute[187223]: 2025-11-28 17:54:23.561 187227 INFO nova.compute.claims [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 483838f6-db84-40e5-a623-242e763d4feb] Claim successful on node compute-0.ctlplane.example.com
Nov 28 17:54:23 compute-0 nova_compute[187223]: 2025-11-28 17:54:23.682 187227 DEBUG nova.compute.provider_tree [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 17:54:23 compute-0 nova_compute[187223]: 2025-11-28 17:54:23.697 187227 DEBUG nova.scheduler.client.report [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 17:54:23 compute-0 nova_compute[187223]: 2025-11-28 17:54:23.722 187227 DEBUG oslo_concurrency.lockutils [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.176s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:54:23 compute-0 nova_compute[187223]: 2025-11-28 17:54:23.723 187227 DEBUG nova.compute.manager [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 483838f6-db84-40e5-a623-242e763d4feb] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 28 17:54:23 compute-0 nova_compute[187223]: 2025-11-28 17:54:23.766 187227 DEBUG nova.compute.manager [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 483838f6-db84-40e5-a623-242e763d4feb] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 28 17:54:23 compute-0 nova_compute[187223]: 2025-11-28 17:54:23.766 187227 DEBUG nova.network.neutron [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 483838f6-db84-40e5-a623-242e763d4feb] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 28 17:54:23 compute-0 nova_compute[187223]: 2025-11-28 17:54:23.785 187227 INFO nova.virt.libvirt.driver [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 483838f6-db84-40e5-a623-242e763d4feb] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 28 17:54:23 compute-0 nova_compute[187223]: 2025-11-28 17:54:23.801 187227 DEBUG nova.compute.manager [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 483838f6-db84-40e5-a623-242e763d4feb] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 28 17:54:23 compute-0 nova_compute[187223]: 2025-11-28 17:54:23.880 187227 DEBUG nova.compute.manager [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 483838f6-db84-40e5-a623-242e763d4feb] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 28 17:54:23 compute-0 nova_compute[187223]: 2025-11-28 17:54:23.882 187227 DEBUG nova.virt.libvirt.driver [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 483838f6-db84-40e5-a623-242e763d4feb] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 28 17:54:23 compute-0 nova_compute[187223]: 2025-11-28 17:54:23.882 187227 INFO nova.virt.libvirt.driver [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 483838f6-db84-40e5-a623-242e763d4feb] Creating image(s)
Nov 28 17:54:23 compute-0 nova_compute[187223]: 2025-11-28 17:54:23.883 187227 DEBUG oslo_concurrency.lockutils [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Acquiring lock "/var/lib/nova/instances/483838f6-db84-40e5-a623-242e763d4feb/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:54:23 compute-0 nova_compute[187223]: 2025-11-28 17:54:23.883 187227 DEBUG oslo_concurrency.lockutils [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "/var/lib/nova/instances/483838f6-db84-40e5-a623-242e763d4feb/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:54:23 compute-0 nova_compute[187223]: 2025-11-28 17:54:23.883 187227 DEBUG oslo_concurrency.lockutils [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "/var/lib/nova/instances/483838f6-db84-40e5-a623-242e763d4feb/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:54:23 compute-0 nova_compute[187223]: 2025-11-28 17:54:23.896 187227 DEBUG oslo_concurrency.processutils [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:54:23 compute-0 nova_compute[187223]: 2025-11-28 17:54:23.986 187227 DEBUG oslo_concurrency.processutils [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:54:23 compute-0 nova_compute[187223]: 2025-11-28 17:54:23.987 187227 DEBUG oslo_concurrency.lockutils [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Acquiring lock "bd6565e0cc32420aac21dd3de8842bc53bb13eba" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:54:23 compute-0 nova_compute[187223]: 2025-11-28 17:54:23.988 187227 DEBUG oslo_concurrency.lockutils [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "bd6565e0cc32420aac21dd3de8842bc53bb13eba" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:54:23 compute-0 nova_compute[187223]: 2025-11-28 17:54:23.999 187227 DEBUG oslo_concurrency.processutils [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:54:24 compute-0 nova_compute[187223]: 2025-11-28 17:54:24.071 187227 DEBUG oslo_concurrency.processutils [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:54:24 compute-0 nova_compute[187223]: 2025-11-28 17:54:24.073 187227 DEBUG oslo_concurrency.processutils [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba,backing_fmt=raw /var/lib/nova/instances/483838f6-db84-40e5-a623-242e763d4feb/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:54:24 compute-0 nova_compute[187223]: 2025-11-28 17:54:24.103 187227 DEBUG oslo_concurrency.processutils [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba,backing_fmt=raw /var/lib/nova/instances/483838f6-db84-40e5-a623-242e763d4feb/disk 1073741824" returned: 0 in 0.030s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:54:24 compute-0 nova_compute[187223]: 2025-11-28 17:54:24.104 187227 DEBUG oslo_concurrency.lockutils [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "bd6565e0cc32420aac21dd3de8842bc53bb13eba" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.116s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:54:24 compute-0 nova_compute[187223]: 2025-11-28 17:54:24.105 187227 DEBUG oslo_concurrency.processutils [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:54:24 compute-0 nova_compute[187223]: 2025-11-28 17:54:24.134 187227 DEBUG nova.policy [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '40bca16232f3471c8094a414f8874e9a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f987f40adf1f46018ab0ca81b8d954f6', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 28 17:54:24 compute-0 nova_compute[187223]: 2025-11-28 17:54:24.154 187227 DEBUG oslo_concurrency.processutils [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:54:24 compute-0 nova_compute[187223]: 2025-11-28 17:54:24.155 187227 DEBUG nova.virt.disk.api [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Checking if we can resize image /var/lib/nova/instances/483838f6-db84-40e5-a623-242e763d4feb/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 28 17:54:24 compute-0 nova_compute[187223]: 2025-11-28 17:54:24.155 187227 DEBUG oslo_concurrency.processutils [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/483838f6-db84-40e5-a623-242e763d4feb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:54:24 compute-0 nova_compute[187223]: 2025-11-28 17:54:24.210 187227 DEBUG oslo_concurrency.processutils [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/483838f6-db84-40e5-a623-242e763d4feb/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:54:24 compute-0 nova_compute[187223]: 2025-11-28 17:54:24.211 187227 DEBUG nova.virt.disk.api [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Cannot resize image /var/lib/nova/instances/483838f6-db84-40e5-a623-242e763d4feb/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 28 17:54:24 compute-0 nova_compute[187223]: 2025-11-28 17:54:24.211 187227 DEBUG nova.objects.instance [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lazy-loading 'migration_context' on Instance uuid 483838f6-db84-40e5-a623-242e763d4feb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:54:24 compute-0 nova_compute[187223]: 2025-11-28 17:54:24.230 187227 DEBUG nova.virt.libvirt.driver [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 483838f6-db84-40e5-a623-242e763d4feb] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 28 17:54:24 compute-0 nova_compute[187223]: 2025-11-28 17:54:24.231 187227 DEBUG nova.virt.libvirt.driver [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 483838f6-db84-40e5-a623-242e763d4feb] Ensure instance console log exists: /var/lib/nova/instances/483838f6-db84-40e5-a623-242e763d4feb/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 28 17:54:24 compute-0 nova_compute[187223]: 2025-11-28 17:54:24.231 187227 DEBUG oslo_concurrency.lockutils [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:54:24 compute-0 nova_compute[187223]: 2025-11-28 17:54:24.231 187227 DEBUG oslo_concurrency.lockutils [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:54:24 compute-0 nova_compute[187223]: 2025-11-28 17:54:24.231 187227 DEBUG oslo_concurrency.lockutils [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:54:24 compute-0 nova_compute[187223]: 2025-11-28 17:54:24.935 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:54:25 compute-0 nova_compute[187223]: 2025-11-28 17:54:25.280 187227 DEBUG nova.network.neutron [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 483838f6-db84-40e5-a623-242e763d4feb] Successfully created port: 5d766eeb-6df1-461c-b40b-6552738e5458 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 28 17:54:25 compute-0 nova_compute[187223]: 2025-11-28 17:54:25.926 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:54:26 compute-0 nova_compute[187223]: 2025-11-28 17:54:26.417 187227 DEBUG nova.network.neutron [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 483838f6-db84-40e5-a623-242e763d4feb] Successfully updated port: 5d766eeb-6df1-461c-b40b-6552738e5458 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 28 17:54:26 compute-0 nova_compute[187223]: 2025-11-28 17:54:26.463 187227 DEBUG oslo_concurrency.lockutils [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Acquiring lock "refresh_cache-483838f6-db84-40e5-a623-242e763d4feb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 17:54:26 compute-0 nova_compute[187223]: 2025-11-28 17:54:26.463 187227 DEBUG oslo_concurrency.lockutils [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Acquired lock "refresh_cache-483838f6-db84-40e5-a623-242e763d4feb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 17:54:26 compute-0 nova_compute[187223]: 2025-11-28 17:54:26.463 187227 DEBUG nova.network.neutron [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 483838f6-db84-40e5-a623-242e763d4feb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 28 17:54:26 compute-0 nova_compute[187223]: 2025-11-28 17:54:26.530 187227 DEBUG nova.compute.manager [req-910bca04-c569-477b-873f-f99f9b6b993d req-8274cf0e-fca6-4dda-a0e5-4148e9167510 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 483838f6-db84-40e5-a623-242e763d4feb] Received event network-changed-5d766eeb-6df1-461c-b40b-6552738e5458 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:54:26 compute-0 nova_compute[187223]: 2025-11-28 17:54:26.531 187227 DEBUG nova.compute.manager [req-910bca04-c569-477b-873f-f99f9b6b993d req-8274cf0e-fca6-4dda-a0e5-4148e9167510 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 483838f6-db84-40e5-a623-242e763d4feb] Refreshing instance network info cache due to event network-changed-5d766eeb-6df1-461c-b40b-6552738e5458. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 28 17:54:26 compute-0 nova_compute[187223]: 2025-11-28 17:54:26.531 187227 DEBUG oslo_concurrency.lockutils [req-910bca04-c569-477b-873f-f99f9b6b993d req-8274cf0e-fca6-4dda-a0e5-4148e9167510 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "refresh_cache-483838f6-db84-40e5-a623-242e763d4feb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 17:54:26 compute-0 nova_compute[187223]: 2025-11-28 17:54:26.602 187227 DEBUG nova.network.neutron [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 483838f6-db84-40e5-a623-242e763d4feb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 28 17:54:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:54:27.706 104433 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:54:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:54:27.706 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:54:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:54:27.707 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:54:27 compute-0 nova_compute[187223]: 2025-11-28 17:54:27.823 187227 DEBUG nova.network.neutron [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 483838f6-db84-40e5-a623-242e763d4feb] Updating instance_info_cache with network_info: [{"id": "5d766eeb-6df1-461c-b40b-6552738e5458", "address": "fa:16:3e:1f:75:aa", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d766eeb-6d", "ovs_interfaceid": "5d766eeb-6df1-461c-b40b-6552738e5458", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:54:27 compute-0 nova_compute[187223]: 2025-11-28 17:54:27.860 187227 DEBUG oslo_concurrency.lockutils [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Releasing lock "refresh_cache-483838f6-db84-40e5-a623-242e763d4feb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 17:54:27 compute-0 nova_compute[187223]: 2025-11-28 17:54:27.861 187227 DEBUG nova.compute.manager [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 483838f6-db84-40e5-a623-242e763d4feb] Instance network_info: |[{"id": "5d766eeb-6df1-461c-b40b-6552738e5458", "address": "fa:16:3e:1f:75:aa", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d766eeb-6d", "ovs_interfaceid": "5d766eeb-6df1-461c-b40b-6552738e5458", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 28 17:54:27 compute-0 nova_compute[187223]: 2025-11-28 17:54:27.861 187227 DEBUG oslo_concurrency.lockutils [req-910bca04-c569-477b-873f-f99f9b6b993d req-8274cf0e-fca6-4dda-a0e5-4148e9167510 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquired lock "refresh_cache-483838f6-db84-40e5-a623-242e763d4feb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 17:54:27 compute-0 nova_compute[187223]: 2025-11-28 17:54:27.861 187227 DEBUG nova.network.neutron [req-910bca04-c569-477b-873f-f99f9b6b993d req-8274cf0e-fca6-4dda-a0e5-4148e9167510 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 483838f6-db84-40e5-a623-242e763d4feb] Refreshing network info cache for port 5d766eeb-6df1-461c-b40b-6552738e5458 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 28 17:54:27 compute-0 nova_compute[187223]: 2025-11-28 17:54:27.863 187227 DEBUG nova.virt.libvirt.driver [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 483838f6-db84-40e5-a623-242e763d4feb] Start _get_guest_xml network_info=[{"id": "5d766eeb-6df1-461c-b40b-6552738e5458", "address": "fa:16:3e:1f:75:aa", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d766eeb-6d", "ovs_interfaceid": "5d766eeb-6df1-461c-b40b-6552738e5458", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T17:27:07Z,direct_url=<?>,disk_format='qcow2',id=e66bcfff-a835-4b6a-9892-490d158c356a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ca3b936304c947f98c618f4bf25dee0e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-28T17:27:08Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_format': None, 'device_name': '/dev/vda', 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e66bcfff-a835-4b6a-9892-490d158c356a'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 28 17:54:27 compute-0 nova_compute[187223]: 2025-11-28 17:54:27.868 187227 WARNING nova.virt.libvirt.driver [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 17:54:27 compute-0 nova_compute[187223]: 2025-11-28 17:54:27.874 187227 DEBUG nova.virt.libvirt.host [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 28 17:54:27 compute-0 nova_compute[187223]: 2025-11-28 17:54:27.874 187227 DEBUG nova.virt.libvirt.host [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 28 17:54:27 compute-0 nova_compute[187223]: 2025-11-28 17:54:27.879 187227 DEBUG nova.virt.libvirt.host [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 28 17:54:27 compute-0 nova_compute[187223]: 2025-11-28 17:54:27.879 187227 DEBUG nova.virt.libvirt.host [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 28 17:54:27 compute-0 nova_compute[187223]: 2025-11-28 17:54:27.881 187227 DEBUG nova.virt.libvirt.driver [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 28 17:54:27 compute-0 nova_compute[187223]: 2025-11-28 17:54:27.881 187227 DEBUG nova.virt.hardware [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-28T17:27:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6f44bded-bdbe-4623-9c87-afc5919e8381',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T17:27:07Z,direct_url=<?>,disk_format='qcow2',id=e66bcfff-a835-4b6a-9892-490d158c356a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ca3b936304c947f98c618f4bf25dee0e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-28T17:27:08Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 28 17:54:27 compute-0 nova_compute[187223]: 2025-11-28 17:54:27.882 187227 DEBUG nova.virt.hardware [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 28 17:54:27 compute-0 nova_compute[187223]: 2025-11-28 17:54:27.882 187227 DEBUG nova.virt.hardware [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 28 17:54:27 compute-0 nova_compute[187223]: 2025-11-28 17:54:27.882 187227 DEBUG nova.virt.hardware [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 28 17:54:27 compute-0 nova_compute[187223]: 2025-11-28 17:54:27.883 187227 DEBUG nova.virt.hardware [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 28 17:54:27 compute-0 nova_compute[187223]: 2025-11-28 17:54:27.883 187227 DEBUG nova.virt.hardware [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 28 17:54:27 compute-0 nova_compute[187223]: 2025-11-28 17:54:27.883 187227 DEBUG nova.virt.hardware [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 28 17:54:27 compute-0 nova_compute[187223]: 2025-11-28 17:54:27.883 187227 DEBUG nova.virt.hardware [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 28 17:54:27 compute-0 nova_compute[187223]: 2025-11-28 17:54:27.884 187227 DEBUG nova.virt.hardware [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 28 17:54:27 compute-0 nova_compute[187223]: 2025-11-28 17:54:27.884 187227 DEBUG nova.virt.hardware [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 28 17:54:27 compute-0 nova_compute[187223]: 2025-11-28 17:54:27.884 187227 DEBUG nova.virt.hardware [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 28 17:54:27 compute-0 nova_compute[187223]: 2025-11-28 17:54:27.888 187227 DEBUG nova.virt.libvirt.vif [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T17:54:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1423921278',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1423921278',id=24,image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f987f40adf1f46018ab0ca81b8d954f6',ramdisk_id='',reservation_id='r-otqdraez',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-384316604',owner_user_name='tempest-TestExecuteStrategies-384316604-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T17:54:23Z,user_data=None,user_id='40bca16232f3471c8094a414f8874e9a',uuid=483838f6-db84-40e5-a623-242e763d4feb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5d766eeb-6df1-461c-b40b-6552738e5458", "address": "fa:16:3e:1f:75:aa", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d766eeb-6d", "ovs_interfaceid": "5d766eeb-6df1-461c-b40b-6552738e5458", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 28 17:54:27 compute-0 nova_compute[187223]: 2025-11-28 17:54:27.888 187227 DEBUG nova.network.os_vif_util [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Converting VIF {"id": "5d766eeb-6df1-461c-b40b-6552738e5458", "address": "fa:16:3e:1f:75:aa", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d766eeb-6d", "ovs_interfaceid": "5d766eeb-6df1-461c-b40b-6552738e5458", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 17:54:27 compute-0 nova_compute[187223]: 2025-11-28 17:54:27.889 187227 DEBUG nova.network.os_vif_util [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1f:75:aa,bridge_name='br-int',has_traffic_filtering=True,id=5d766eeb-6df1-461c-b40b-6552738e5458,network=Network(7710a7d0-31b3-4473-89c4-40533fdd6e7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5d766eeb-6d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 17:54:27 compute-0 nova_compute[187223]: 2025-11-28 17:54:27.890 187227 DEBUG nova.objects.instance [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lazy-loading 'pci_devices' on Instance uuid 483838f6-db84-40e5-a623-242e763d4feb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:54:27 compute-0 nova_compute[187223]: 2025-11-28 17:54:27.903 187227 DEBUG nova.virt.libvirt.driver [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 483838f6-db84-40e5-a623-242e763d4feb] End _get_guest_xml xml=<domain type="kvm">
Nov 28 17:54:27 compute-0 nova_compute[187223]:   <uuid>483838f6-db84-40e5-a623-242e763d4feb</uuid>
Nov 28 17:54:27 compute-0 nova_compute[187223]:   <name>instance-00000018</name>
Nov 28 17:54:27 compute-0 nova_compute[187223]:   <memory>131072</memory>
Nov 28 17:54:27 compute-0 nova_compute[187223]:   <vcpu>1</vcpu>
Nov 28 17:54:27 compute-0 nova_compute[187223]:   <metadata>
Nov 28 17:54:27 compute-0 nova_compute[187223]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 28 17:54:27 compute-0 nova_compute[187223]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 28 17:54:27 compute-0 nova_compute[187223]:       <nova:name>tempest-TestExecuteStrategies-server-1423921278</nova:name>
Nov 28 17:54:27 compute-0 nova_compute[187223]:       <nova:creationTime>2025-11-28 17:54:27</nova:creationTime>
Nov 28 17:54:27 compute-0 nova_compute[187223]:       <nova:flavor name="m1.nano">
Nov 28 17:54:27 compute-0 nova_compute[187223]:         <nova:memory>128</nova:memory>
Nov 28 17:54:27 compute-0 nova_compute[187223]:         <nova:disk>1</nova:disk>
Nov 28 17:54:27 compute-0 nova_compute[187223]:         <nova:swap>0</nova:swap>
Nov 28 17:54:27 compute-0 nova_compute[187223]:         <nova:ephemeral>0</nova:ephemeral>
Nov 28 17:54:27 compute-0 nova_compute[187223]:         <nova:vcpus>1</nova:vcpus>
Nov 28 17:54:27 compute-0 nova_compute[187223]:       </nova:flavor>
Nov 28 17:54:27 compute-0 nova_compute[187223]:       <nova:owner>
Nov 28 17:54:27 compute-0 nova_compute[187223]:         <nova:user uuid="40bca16232f3471c8094a414f8874e9a">tempest-TestExecuteStrategies-384316604-project-member</nova:user>
Nov 28 17:54:27 compute-0 nova_compute[187223]:         <nova:project uuid="f987f40adf1f46018ab0ca81b8d954f6">tempest-TestExecuteStrategies-384316604</nova:project>
Nov 28 17:54:27 compute-0 nova_compute[187223]:       </nova:owner>
Nov 28 17:54:27 compute-0 nova_compute[187223]:       <nova:root type="image" uuid="e66bcfff-a835-4b6a-9892-490d158c356a"/>
Nov 28 17:54:27 compute-0 nova_compute[187223]:       <nova:ports>
Nov 28 17:54:27 compute-0 nova_compute[187223]:         <nova:port uuid="5d766eeb-6df1-461c-b40b-6552738e5458">
Nov 28 17:54:27 compute-0 nova_compute[187223]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 28 17:54:27 compute-0 nova_compute[187223]:         </nova:port>
Nov 28 17:54:27 compute-0 nova_compute[187223]:       </nova:ports>
Nov 28 17:54:27 compute-0 nova_compute[187223]:     </nova:instance>
Nov 28 17:54:27 compute-0 nova_compute[187223]:   </metadata>
Nov 28 17:54:27 compute-0 nova_compute[187223]:   <sysinfo type="smbios">
Nov 28 17:54:27 compute-0 nova_compute[187223]:     <system>
Nov 28 17:54:27 compute-0 nova_compute[187223]:       <entry name="manufacturer">RDO</entry>
Nov 28 17:54:27 compute-0 nova_compute[187223]:       <entry name="product">OpenStack Compute</entry>
Nov 28 17:54:27 compute-0 nova_compute[187223]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 28 17:54:27 compute-0 nova_compute[187223]:       <entry name="serial">483838f6-db84-40e5-a623-242e763d4feb</entry>
Nov 28 17:54:27 compute-0 nova_compute[187223]:       <entry name="uuid">483838f6-db84-40e5-a623-242e763d4feb</entry>
Nov 28 17:54:27 compute-0 nova_compute[187223]:       <entry name="family">Virtual Machine</entry>
Nov 28 17:54:27 compute-0 nova_compute[187223]:     </system>
Nov 28 17:54:27 compute-0 nova_compute[187223]:   </sysinfo>
Nov 28 17:54:27 compute-0 nova_compute[187223]:   <os>
Nov 28 17:54:27 compute-0 nova_compute[187223]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 28 17:54:27 compute-0 nova_compute[187223]:     <boot dev="hd"/>
Nov 28 17:54:27 compute-0 nova_compute[187223]:     <smbios mode="sysinfo"/>
Nov 28 17:54:27 compute-0 nova_compute[187223]:   </os>
Nov 28 17:54:27 compute-0 nova_compute[187223]:   <features>
Nov 28 17:54:27 compute-0 nova_compute[187223]:     <acpi/>
Nov 28 17:54:27 compute-0 nova_compute[187223]:     <apic/>
Nov 28 17:54:27 compute-0 nova_compute[187223]:     <vmcoreinfo/>
Nov 28 17:54:27 compute-0 nova_compute[187223]:   </features>
Nov 28 17:54:27 compute-0 nova_compute[187223]:   <clock offset="utc">
Nov 28 17:54:27 compute-0 nova_compute[187223]:     <timer name="pit" tickpolicy="delay"/>
Nov 28 17:54:27 compute-0 nova_compute[187223]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 28 17:54:27 compute-0 nova_compute[187223]:     <timer name="hpet" present="no"/>
Nov 28 17:54:27 compute-0 nova_compute[187223]:   </clock>
Nov 28 17:54:27 compute-0 nova_compute[187223]:   <cpu mode="custom" match="exact">
Nov 28 17:54:27 compute-0 nova_compute[187223]:     <model>Nehalem</model>
Nov 28 17:54:27 compute-0 nova_compute[187223]:     <topology sockets="1" cores="1" threads="1"/>
Nov 28 17:54:27 compute-0 nova_compute[187223]:   </cpu>
Nov 28 17:54:27 compute-0 nova_compute[187223]:   <devices>
Nov 28 17:54:27 compute-0 nova_compute[187223]:     <disk type="file" device="disk">
Nov 28 17:54:27 compute-0 nova_compute[187223]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 28 17:54:27 compute-0 nova_compute[187223]:       <source file="/var/lib/nova/instances/483838f6-db84-40e5-a623-242e763d4feb/disk"/>
Nov 28 17:54:27 compute-0 nova_compute[187223]:       <target dev="vda" bus="virtio"/>
Nov 28 17:54:27 compute-0 nova_compute[187223]:     </disk>
Nov 28 17:54:27 compute-0 nova_compute[187223]:     <disk type="file" device="cdrom">
Nov 28 17:54:27 compute-0 nova_compute[187223]:       <driver name="qemu" type="raw" cache="none"/>
Nov 28 17:54:27 compute-0 nova_compute[187223]:       <source file="/var/lib/nova/instances/483838f6-db84-40e5-a623-242e763d4feb/disk.config"/>
Nov 28 17:54:27 compute-0 nova_compute[187223]:       <target dev="sda" bus="sata"/>
Nov 28 17:54:27 compute-0 nova_compute[187223]:     </disk>
Nov 28 17:54:27 compute-0 nova_compute[187223]:     <interface type="ethernet">
Nov 28 17:54:27 compute-0 nova_compute[187223]:       <mac address="fa:16:3e:1f:75:aa"/>
Nov 28 17:54:27 compute-0 nova_compute[187223]:       <model type="virtio"/>
Nov 28 17:54:27 compute-0 nova_compute[187223]:       <driver name="vhost" rx_queue_size="512"/>
Nov 28 17:54:27 compute-0 nova_compute[187223]:       <mtu size="1442"/>
Nov 28 17:54:27 compute-0 nova_compute[187223]:       <target dev="tap5d766eeb-6d"/>
Nov 28 17:54:27 compute-0 nova_compute[187223]:     </interface>
Nov 28 17:54:27 compute-0 nova_compute[187223]:     <serial type="pty">
Nov 28 17:54:27 compute-0 nova_compute[187223]:       <log file="/var/lib/nova/instances/483838f6-db84-40e5-a623-242e763d4feb/console.log" append="off"/>
Nov 28 17:54:27 compute-0 nova_compute[187223]:     </serial>
Nov 28 17:54:27 compute-0 nova_compute[187223]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 28 17:54:27 compute-0 nova_compute[187223]:     <video>
Nov 28 17:54:27 compute-0 nova_compute[187223]:       <model type="virtio"/>
Nov 28 17:54:27 compute-0 nova_compute[187223]:     </video>
Nov 28 17:54:27 compute-0 nova_compute[187223]:     <input type="tablet" bus="usb"/>
Nov 28 17:54:27 compute-0 nova_compute[187223]:     <rng model="virtio">
Nov 28 17:54:27 compute-0 nova_compute[187223]:       <backend model="random">/dev/urandom</backend>
Nov 28 17:54:27 compute-0 nova_compute[187223]:     </rng>
Nov 28 17:54:27 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root"/>
Nov 28 17:54:27 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:54:27 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:54:27 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:54:27 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:54:27 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:54:27 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:54:27 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:54:27 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:54:27 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:54:27 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:54:27 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:54:27 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:54:27 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:54:27 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:54:27 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:54:27 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:54:27 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:54:27 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:54:27 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:54:27 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:54:27 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:54:27 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:54:27 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:54:27 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:54:27 compute-0 nova_compute[187223]:     <controller type="usb" index="0"/>
Nov 28 17:54:27 compute-0 nova_compute[187223]:     <memballoon model="virtio">
Nov 28 17:54:27 compute-0 nova_compute[187223]:       <stats period="10"/>
Nov 28 17:54:27 compute-0 nova_compute[187223]:     </memballoon>
Nov 28 17:54:27 compute-0 nova_compute[187223]:   </devices>
Nov 28 17:54:27 compute-0 nova_compute[187223]: </domain>
Nov 28 17:54:27 compute-0 nova_compute[187223]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 28 17:54:27 compute-0 nova_compute[187223]: 2025-11-28 17:54:27.904 187227 DEBUG nova.compute.manager [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 483838f6-db84-40e5-a623-242e763d4feb] Preparing to wait for external event network-vif-plugged-5d766eeb-6df1-461c-b40b-6552738e5458 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 28 17:54:27 compute-0 nova_compute[187223]: 2025-11-28 17:54:27.904 187227 DEBUG oslo_concurrency.lockutils [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Acquiring lock "483838f6-db84-40e5-a623-242e763d4feb-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:54:27 compute-0 nova_compute[187223]: 2025-11-28 17:54:27.904 187227 DEBUG oslo_concurrency.lockutils [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "483838f6-db84-40e5-a623-242e763d4feb-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:54:27 compute-0 nova_compute[187223]: 2025-11-28 17:54:27.905 187227 DEBUG oslo_concurrency.lockutils [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "483838f6-db84-40e5-a623-242e763d4feb-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:54:27 compute-0 nova_compute[187223]: 2025-11-28 17:54:27.905 187227 DEBUG nova.virt.libvirt.vif [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T17:54:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1423921278',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1423921278',id=24,image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f987f40adf1f46018ab0ca81b8d954f6',ramdisk_id='',reservation_id='r-otqdraez',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-384316604',owner_user_name='tempest-TestExecuteStrategies-384316604-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T17:54:23Z,user_data=None,user_id='40bca16232f3471c8094a414f8874e9a',uuid=483838f6-db84-40e5-a623-242e763d4feb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5d766eeb-6df1-461c-b40b-6552738e5458", "address": "fa:16:3e:1f:75:aa", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d766eeb-6d", "ovs_interfaceid": "5d766eeb-6df1-461c-b40b-6552738e5458", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 28 17:54:27 compute-0 nova_compute[187223]: 2025-11-28 17:54:27.906 187227 DEBUG nova.network.os_vif_util [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Converting VIF {"id": "5d766eeb-6df1-461c-b40b-6552738e5458", "address": "fa:16:3e:1f:75:aa", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d766eeb-6d", "ovs_interfaceid": "5d766eeb-6df1-461c-b40b-6552738e5458", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 17:54:27 compute-0 nova_compute[187223]: 2025-11-28 17:54:27.906 187227 DEBUG nova.network.os_vif_util [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1f:75:aa,bridge_name='br-int',has_traffic_filtering=True,id=5d766eeb-6df1-461c-b40b-6552738e5458,network=Network(7710a7d0-31b3-4473-89c4-40533fdd6e7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5d766eeb-6d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 17:54:27 compute-0 nova_compute[187223]: 2025-11-28 17:54:27.906 187227 DEBUG os_vif [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1f:75:aa,bridge_name='br-int',has_traffic_filtering=True,id=5d766eeb-6df1-461c-b40b-6552738e5458,network=Network(7710a7d0-31b3-4473-89c4-40533fdd6e7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5d766eeb-6d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 28 17:54:27 compute-0 nova_compute[187223]: 2025-11-28 17:54:27.907 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:54:27 compute-0 nova_compute[187223]: 2025-11-28 17:54:27.907 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:54:27 compute-0 nova_compute[187223]: 2025-11-28 17:54:27.908 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 17:54:27 compute-0 nova_compute[187223]: 2025-11-28 17:54:27.910 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:54:27 compute-0 nova_compute[187223]: 2025-11-28 17:54:27.910 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5d766eeb-6d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:54:27 compute-0 nova_compute[187223]: 2025-11-28 17:54:27.911 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5d766eeb-6d, col_values=(('external_ids', {'iface-id': '5d766eeb-6df1-461c-b40b-6552738e5458', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1f:75:aa', 'vm-uuid': '483838f6-db84-40e5-a623-242e763d4feb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:54:27 compute-0 NetworkManager[55763]: <info>  [1764352467.9301] manager: (tap5d766eeb-6d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/71)
Nov 28 17:54:27 compute-0 nova_compute[187223]: 2025-11-28 17:54:27.928 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:54:27 compute-0 nova_compute[187223]: 2025-11-28 17:54:27.932 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 17:54:27 compute-0 nova_compute[187223]: 2025-11-28 17:54:27.936 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:54:27 compute-0 nova_compute[187223]: 2025-11-28 17:54:27.936 187227 INFO os_vif [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1f:75:aa,bridge_name='br-int',has_traffic_filtering=True,id=5d766eeb-6df1-461c-b40b-6552738e5458,network=Network(7710a7d0-31b3-4473-89c4-40533fdd6e7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5d766eeb-6d')
Nov 28 17:54:28 compute-0 nova_compute[187223]: 2025-11-28 17:54:28.020 187227 DEBUG nova.virt.libvirt.driver [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 28 17:54:28 compute-0 nova_compute[187223]: 2025-11-28 17:54:28.020 187227 DEBUG nova.virt.libvirt.driver [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 28 17:54:28 compute-0 nova_compute[187223]: 2025-11-28 17:54:28.020 187227 DEBUG nova.virt.libvirt.driver [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] No VIF found with MAC fa:16:3e:1f:75:aa, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 28 17:54:28 compute-0 nova_compute[187223]: 2025-11-28 17:54:28.021 187227 INFO nova.virt.libvirt.driver [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 483838f6-db84-40e5-a623-242e763d4feb] Using config drive
Nov 28 17:54:28 compute-0 nova_compute[187223]: 2025-11-28 17:54:28.405 187227 INFO nova.virt.libvirt.driver [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 483838f6-db84-40e5-a623-242e763d4feb] Creating config drive at /var/lib/nova/instances/483838f6-db84-40e5-a623-242e763d4feb/disk.config
Nov 28 17:54:28 compute-0 nova_compute[187223]: 2025-11-28 17:54:28.410 187227 DEBUG oslo_concurrency.processutils [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/483838f6-db84-40e5-a623-242e763d4feb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpnceaam1c execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:54:28 compute-0 nova_compute[187223]: 2025-11-28 17:54:28.533 187227 DEBUG oslo_concurrency.processutils [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/483838f6-db84-40e5-a623-242e763d4feb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpnceaam1c" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:54:28 compute-0 kernel: tap5d766eeb-6d: entered promiscuous mode
Nov 28 17:54:28 compute-0 NetworkManager[55763]: <info>  [1764352468.6074] manager: (tap5d766eeb-6d): new Tun device (/org/freedesktop/NetworkManager/Devices/72)
Nov 28 17:54:28 compute-0 ovn_controller[95574]: 2025-11-28T17:54:28Z|00169|binding|INFO|Claiming lport 5d766eeb-6df1-461c-b40b-6552738e5458 for this chassis.
Nov 28 17:54:28 compute-0 ovn_controller[95574]: 2025-11-28T17:54:28Z|00170|binding|INFO|5d766eeb-6df1-461c-b40b-6552738e5458: Claiming fa:16:3e:1f:75:aa 10.100.0.8
Nov 28 17:54:28 compute-0 nova_compute[187223]: 2025-11-28 17:54:28.606 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:54:28 compute-0 ovn_controller[95574]: 2025-11-28T17:54:28Z|00171|binding|INFO|Setting lport 5d766eeb-6df1-461c-b40b-6552738e5458 up in Southbound
Nov 28 17:54:28 compute-0 ovn_controller[95574]: 2025-11-28T17:54:28Z|00172|binding|INFO|Setting lport 5d766eeb-6df1-461c-b40b-6552738e5458 ovn-installed in OVS
Nov 28 17:54:28 compute-0 nova_compute[187223]: 2025-11-28 17:54:28.620 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:54:28 compute-0 nova_compute[187223]: 2025-11-28 17:54:28.621 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:54:28 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:54:28.619 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1f:75:aa 10.100.0.8'], port_security=['fa:16:3e:1f:75:aa 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '483838f6-db84-40e5-a623-242e763d4feb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f987f40adf1f46018ab0ca81b8d954f6', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2454196c-3f33-40c4-b65e-ff5ce51cee25', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8b91a17c-2725-4cda-baa8-a6987e8e4bba, chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], logical_port=5d766eeb-6df1-461c-b40b-6552738e5458) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 17:54:28 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:54:28.621 104433 INFO neutron.agent.ovn.metadata.agent [-] Port 5d766eeb-6df1-461c-b40b-6552738e5458 in datapath 7710a7d0-31b3-4473-89c4-40533fdd6e7d bound to our chassis
Nov 28 17:54:28 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:54:28.623 104433 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7710a7d0-31b3-4473-89c4-40533fdd6e7d
Nov 28 17:54:28 compute-0 nova_compute[187223]: 2025-11-28 17:54:28.625 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:54:28 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:54:28.633 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[3f025a5f-5795-485b-b19b-1aa935ae8387]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:54:28 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:54:28.636 104433 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7710a7d0-31 in ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 28 17:54:28 compute-0 systemd-udevd[217149]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 17:54:28 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:54:28.638 208826 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7710a7d0-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 28 17:54:28 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:54:28.638 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[35b872da-fb7b-4312-8ecb-fc20d6696017]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:54:28 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:54:28.639 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[799af2d4-19ab-42c5-9fb9-fdae54552137]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:54:28 compute-0 systemd-machined[153517]: New machine qemu-16-instance-00000018.
Nov 28 17:54:28 compute-0 NetworkManager[55763]: <info>  [1764352468.6509] device (tap5d766eeb-6d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 28 17:54:28 compute-0 NetworkManager[55763]: <info>  [1764352468.6523] device (tap5d766eeb-6d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 28 17:54:28 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:54:28.651 104546 DEBUG oslo.privsep.daemon [-] privsep: reply[03674dd9-22fc-426d-931b-1909e28d43c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:54:28 compute-0 systemd[1]: Started Virtual Machine qemu-16-instance-00000018.
Nov 28 17:54:28 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:54:28.665 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[54356ba8-70ed-4ecb-b4f0-2bf4fec972e6]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:54:28 compute-0 podman[217132]: 2025-11-28 17:54:28.666512842 +0000 UTC m=+0.064555366 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 28 17:54:28 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:54:28.691 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[8214dc0e-4efd-4f29-8f29-d1db944579ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:54:28 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:54:28.696 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[a5cf171d-fcbf-4294-ae62-875e3f25fbc6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:54:28 compute-0 NetworkManager[55763]: <info>  [1764352468.6971] manager: (tap7710a7d0-30): new Veth device (/org/freedesktop/NetworkManager/Devices/73)
Nov 28 17:54:28 compute-0 systemd-udevd[217158]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 17:54:28 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:54:28.722 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[f78e4911-1551-4221-ab4f-3989228c9429]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:54:28 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:54:28.725 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[291ef071-dcfe-41af-a8ad-1853a73fe33e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:54:28 compute-0 NetworkManager[55763]: <info>  [1764352468.7424] device (tap7710a7d0-30): carrier: link connected
Nov 28 17:54:28 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:54:28.747 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[a9b0e6ac-15ea-4c39-8ff8-a2a49838d4a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:54:28 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:54:28.764 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[6c2413d1-e414-4ca8-9880-451e039bc095]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7710a7d0-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7e:b9:9f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 51], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 564337, 'reachable_time': 34349, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217189, 'error': None, 'target': 'ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:54:28 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:54:28.780 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[637d7748-8fbb-46ae-9890-e2ab9ceb0a3c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe7e:b99f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 564337, 'tstamp': 564337}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217190, 'error': None, 'target': 'ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:54:28 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:54:28.796 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[094a5aba-65c2-4870-9fb8-64515687c825]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7710a7d0-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7e:b9:9f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 51], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 564337, 'reachable_time': 34349, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217191, 'error': None, 'target': 'ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:54:28 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:54:28.821 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[4c9b0fba-2c74-42a3-83b0-d6d425cfc54b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:54:28 compute-0 nova_compute[187223]: 2025-11-28 17:54:28.831 187227 DEBUG nova.compute.manager [req-76c20dc4-120d-4e8a-939a-542b1a0b281a req-d4741b9f-e5d6-47b0-998c-d322c397ad76 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 483838f6-db84-40e5-a623-242e763d4feb] Received event network-vif-plugged-5d766eeb-6df1-461c-b40b-6552738e5458 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:54:28 compute-0 nova_compute[187223]: 2025-11-28 17:54:28.832 187227 DEBUG oslo_concurrency.lockutils [req-76c20dc4-120d-4e8a-939a-542b1a0b281a req-d4741b9f-e5d6-47b0-998c-d322c397ad76 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "483838f6-db84-40e5-a623-242e763d4feb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:54:28 compute-0 nova_compute[187223]: 2025-11-28 17:54:28.832 187227 DEBUG oslo_concurrency.lockutils [req-76c20dc4-120d-4e8a-939a-542b1a0b281a req-d4741b9f-e5d6-47b0-998c-d322c397ad76 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "483838f6-db84-40e5-a623-242e763d4feb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:54:28 compute-0 nova_compute[187223]: 2025-11-28 17:54:28.832 187227 DEBUG oslo_concurrency.lockutils [req-76c20dc4-120d-4e8a-939a-542b1a0b281a req-d4741b9f-e5d6-47b0-998c-d322c397ad76 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "483838f6-db84-40e5-a623-242e763d4feb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:54:28 compute-0 nova_compute[187223]: 2025-11-28 17:54:28.832 187227 DEBUG nova.compute.manager [req-76c20dc4-120d-4e8a-939a-542b1a0b281a req-d4741b9f-e5d6-47b0-998c-d322c397ad76 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 483838f6-db84-40e5-a623-242e763d4feb] Processing event network-vif-plugged-5d766eeb-6df1-461c-b40b-6552738e5458 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 28 17:54:28 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:54:28.883 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[511038f3-ecd0-4f9b-bd72-a84d38949c07]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:54:28 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:54:28.884 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7710a7d0-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:54:28 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:54:28.884 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 17:54:28 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:54:28.884 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7710a7d0-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:54:28 compute-0 nova_compute[187223]: 2025-11-28 17:54:28.886 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:54:28 compute-0 kernel: tap7710a7d0-30: entered promiscuous mode
Nov 28 17:54:28 compute-0 NetworkManager[55763]: <info>  [1764352468.8882] manager: (tap7710a7d0-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/74)
Nov 28 17:54:28 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:54:28.890 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7710a7d0-30, col_values=(('external_ids', {'iface-id': 'bc789832-2d4b-4b14-95c2-e30a740a3a6b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:54:28 compute-0 ovn_controller[95574]: 2025-11-28T17:54:28Z|00173|binding|INFO|Releasing lport bc789832-2d4b-4b14-95c2-e30a740a3a6b from this chassis (sb_readonly=0)
Nov 28 17:54:28 compute-0 nova_compute[187223]: 2025-11-28 17:54:28.891 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:54:28 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:54:28.894 104433 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7710a7d0-31b3-4473-89c4-40533fdd6e7d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7710a7d0-31b3-4473-89c4-40533fdd6e7d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 28 17:54:28 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:54:28.895 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[805c274b-a9ec-4c28-9ce1-16d8548a3fba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:54:28 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:54:28.896 104433 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 28 17:54:28 compute-0 ovn_metadata_agent[104428]: global
Nov 28 17:54:28 compute-0 ovn_metadata_agent[104428]:     log         /dev/log local0 debug
Nov 28 17:54:28 compute-0 ovn_metadata_agent[104428]:     log-tag     haproxy-metadata-proxy-7710a7d0-31b3-4473-89c4-40533fdd6e7d
Nov 28 17:54:28 compute-0 ovn_metadata_agent[104428]:     user        root
Nov 28 17:54:28 compute-0 ovn_metadata_agent[104428]:     group       root
Nov 28 17:54:28 compute-0 ovn_metadata_agent[104428]:     maxconn     1024
Nov 28 17:54:28 compute-0 ovn_metadata_agent[104428]:     pidfile     /var/lib/neutron/external/pids/7710a7d0-31b3-4473-89c4-40533fdd6e7d.pid.haproxy
Nov 28 17:54:28 compute-0 ovn_metadata_agent[104428]:     daemon
Nov 28 17:54:28 compute-0 ovn_metadata_agent[104428]: 
Nov 28 17:54:28 compute-0 ovn_metadata_agent[104428]: defaults
Nov 28 17:54:28 compute-0 ovn_metadata_agent[104428]:     log global
Nov 28 17:54:28 compute-0 ovn_metadata_agent[104428]:     mode http
Nov 28 17:54:28 compute-0 ovn_metadata_agent[104428]:     option httplog
Nov 28 17:54:28 compute-0 ovn_metadata_agent[104428]:     option dontlognull
Nov 28 17:54:28 compute-0 ovn_metadata_agent[104428]:     option http-server-close
Nov 28 17:54:28 compute-0 ovn_metadata_agent[104428]:     option forwardfor
Nov 28 17:54:28 compute-0 ovn_metadata_agent[104428]:     retries                 3
Nov 28 17:54:28 compute-0 ovn_metadata_agent[104428]:     timeout http-request    30s
Nov 28 17:54:28 compute-0 ovn_metadata_agent[104428]:     timeout connect         30s
Nov 28 17:54:28 compute-0 ovn_metadata_agent[104428]:     timeout client          32s
Nov 28 17:54:28 compute-0 ovn_metadata_agent[104428]:     timeout server          32s
Nov 28 17:54:28 compute-0 ovn_metadata_agent[104428]:     timeout http-keep-alive 30s
Nov 28 17:54:28 compute-0 ovn_metadata_agent[104428]: 
Nov 28 17:54:28 compute-0 ovn_metadata_agent[104428]: 
Nov 28 17:54:28 compute-0 ovn_metadata_agent[104428]: listen listener
Nov 28 17:54:28 compute-0 ovn_metadata_agent[104428]:     bind 169.254.169.254:80
Nov 28 17:54:28 compute-0 ovn_metadata_agent[104428]:     server metadata /var/lib/neutron/metadata_proxy
Nov 28 17:54:28 compute-0 ovn_metadata_agent[104428]:     http-request add-header X-OVN-Network-ID 7710a7d0-31b3-4473-89c4-40533fdd6e7d
Nov 28 17:54:28 compute-0 ovn_metadata_agent[104428]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 28 17:54:28 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:54:28.897 104433 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'env', 'PROCESS_TAG=haproxy-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7710a7d0-31b3-4473-89c4-40533fdd6e7d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 28 17:54:28 compute-0 nova_compute[187223]: 2025-11-28 17:54:28.903 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:54:28 compute-0 nova_compute[187223]: 2025-11-28 17:54:28.918 187227 DEBUG nova.network.neutron [req-910bca04-c569-477b-873f-f99f9b6b993d req-8274cf0e-fca6-4dda-a0e5-4148e9167510 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 483838f6-db84-40e5-a623-242e763d4feb] Updated VIF entry in instance network info cache for port 5d766eeb-6df1-461c-b40b-6552738e5458. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 28 17:54:28 compute-0 nova_compute[187223]: 2025-11-28 17:54:28.918 187227 DEBUG nova.network.neutron [req-910bca04-c569-477b-873f-f99f9b6b993d req-8274cf0e-fca6-4dda-a0e5-4148e9167510 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 483838f6-db84-40e5-a623-242e763d4feb] Updating instance_info_cache with network_info: [{"id": "5d766eeb-6df1-461c-b40b-6552738e5458", "address": "fa:16:3e:1f:75:aa", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d766eeb-6d", "ovs_interfaceid": "5d766eeb-6df1-461c-b40b-6552738e5458", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:54:28 compute-0 nova_compute[187223]: 2025-11-28 17:54:28.935 187227 DEBUG oslo_concurrency.lockutils [req-910bca04-c569-477b-873f-f99f9b6b993d req-8274cf0e-fca6-4dda-a0e5-4148e9167510 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Releasing lock "refresh_cache-483838f6-db84-40e5-a623-242e763d4feb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 17:54:28 compute-0 nova_compute[187223]: 2025-11-28 17:54:28.942 187227 DEBUG nova.virt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Emitting event <LifecycleEvent: 1764352468.9416409, 483838f6-db84-40e5-a623-242e763d4feb => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:54:28 compute-0 nova_compute[187223]: 2025-11-28 17:54:28.942 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 483838f6-db84-40e5-a623-242e763d4feb] VM Started (Lifecycle Event)
Nov 28 17:54:28 compute-0 nova_compute[187223]: 2025-11-28 17:54:28.944 187227 DEBUG nova.compute.manager [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 483838f6-db84-40e5-a623-242e763d4feb] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 28 17:54:28 compute-0 nova_compute[187223]: 2025-11-28 17:54:28.947 187227 DEBUG nova.virt.libvirt.driver [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 483838f6-db84-40e5-a623-242e763d4feb] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 28 17:54:28 compute-0 nova_compute[187223]: 2025-11-28 17:54:28.950 187227 INFO nova.virt.libvirt.driver [-] [instance: 483838f6-db84-40e5-a623-242e763d4feb] Instance spawned successfully.
Nov 28 17:54:28 compute-0 nova_compute[187223]: 2025-11-28 17:54:28.950 187227 DEBUG nova.virt.libvirt.driver [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 483838f6-db84-40e5-a623-242e763d4feb] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 28 17:54:28 compute-0 nova_compute[187223]: 2025-11-28 17:54:28.965 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 483838f6-db84-40e5-a623-242e763d4feb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:54:28 compute-0 nova_compute[187223]: 2025-11-28 17:54:28.969 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 483838f6-db84-40e5-a623-242e763d4feb] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 28 17:54:28 compute-0 nova_compute[187223]: 2025-11-28 17:54:28.972 187227 DEBUG nova.virt.libvirt.driver [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 483838f6-db84-40e5-a623-242e763d4feb] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:54:28 compute-0 nova_compute[187223]: 2025-11-28 17:54:28.972 187227 DEBUG nova.virt.libvirt.driver [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 483838f6-db84-40e5-a623-242e763d4feb] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:54:28 compute-0 nova_compute[187223]: 2025-11-28 17:54:28.972 187227 DEBUG nova.virt.libvirt.driver [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 483838f6-db84-40e5-a623-242e763d4feb] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:54:28 compute-0 nova_compute[187223]: 2025-11-28 17:54:28.973 187227 DEBUG nova.virt.libvirt.driver [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 483838f6-db84-40e5-a623-242e763d4feb] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:54:28 compute-0 nova_compute[187223]: 2025-11-28 17:54:28.973 187227 DEBUG nova.virt.libvirt.driver [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 483838f6-db84-40e5-a623-242e763d4feb] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:54:28 compute-0 nova_compute[187223]: 2025-11-28 17:54:28.973 187227 DEBUG nova.virt.libvirt.driver [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 483838f6-db84-40e5-a623-242e763d4feb] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:54:29 compute-0 nova_compute[187223]: 2025-11-28 17:54:29.076 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 483838f6-db84-40e5-a623-242e763d4feb] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 28 17:54:29 compute-0 nova_compute[187223]: 2025-11-28 17:54:29.076 187227 DEBUG nova.virt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Emitting event <LifecycleEvent: 1764352468.9438589, 483838f6-db84-40e5-a623-242e763d4feb => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:54:29 compute-0 nova_compute[187223]: 2025-11-28 17:54:29.076 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 483838f6-db84-40e5-a623-242e763d4feb] VM Paused (Lifecycle Event)
Nov 28 17:54:29 compute-0 nova_compute[187223]: 2025-11-28 17:54:29.106 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 483838f6-db84-40e5-a623-242e763d4feb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:54:29 compute-0 nova_compute[187223]: 2025-11-28 17:54:29.109 187227 DEBUG nova.virt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Emitting event <LifecycleEvent: 1764352468.9460957, 483838f6-db84-40e5-a623-242e763d4feb => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:54:29 compute-0 nova_compute[187223]: 2025-11-28 17:54:29.109 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 483838f6-db84-40e5-a623-242e763d4feb] VM Resumed (Lifecycle Event)
Nov 28 17:54:29 compute-0 nova_compute[187223]: 2025-11-28 17:54:29.126 187227 INFO nova.compute.manager [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 483838f6-db84-40e5-a623-242e763d4feb] Took 5.25 seconds to spawn the instance on the hypervisor.
Nov 28 17:54:29 compute-0 nova_compute[187223]: 2025-11-28 17:54:29.126 187227 DEBUG nova.compute.manager [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 483838f6-db84-40e5-a623-242e763d4feb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:54:29 compute-0 nova_compute[187223]: 2025-11-28 17:54:29.163 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 483838f6-db84-40e5-a623-242e763d4feb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:54:29 compute-0 nova_compute[187223]: 2025-11-28 17:54:29.166 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 483838f6-db84-40e5-a623-242e763d4feb] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 28 17:54:29 compute-0 nova_compute[187223]: 2025-11-28 17:54:29.206 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 483838f6-db84-40e5-a623-242e763d4feb] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 28 17:54:29 compute-0 nova_compute[187223]: 2025-11-28 17:54:29.224 187227 INFO nova.compute.manager [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 483838f6-db84-40e5-a623-242e763d4feb] Took 5.73 seconds to build instance.
Nov 28 17:54:29 compute-0 nova_compute[187223]: 2025-11-28 17:54:29.246 187227 DEBUG oslo_concurrency.lockutils [None req-2d6019c2-df22-41f0-8edf-b37d01e423c8 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "483838f6-db84-40e5-a623-242e763d4feb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.827s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:54:29 compute-0 podman[217228]: 2025-11-28 17:54:29.281138504 +0000 UTC m=+0.051340120 container create 9def1c45b49768736aa251d7d5763d6cde9d84ac67db05681fc7418061b1d569 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 28 17:54:29 compute-0 systemd[1]: Started libpod-conmon-9def1c45b49768736aa251d7d5763d6cde9d84ac67db05681fc7418061b1d569.scope.
Nov 28 17:54:29 compute-0 systemd[1]: Started libcrun container.
Nov 28 17:54:29 compute-0 podman[217228]: 2025-11-28 17:54:29.251690699 +0000 UTC m=+0.021892335 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 17:54:29 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/67fe6d84e4a42a07208eed7734351f007b412da73f1bd34e7771867da2a8c455/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 17:54:29 compute-0 podman[217228]: 2025-11-28 17:54:29.365991545 +0000 UTC m=+0.136193181 container init 9def1c45b49768736aa251d7d5763d6cde9d84ac67db05681fc7418061b1d569 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3)
Nov 28 17:54:29 compute-0 podman[217228]: 2025-11-28 17:54:29.370852501 +0000 UTC m=+0.141054127 container start 9def1c45b49768736aa251d7d5763d6cde9d84ac67db05681fc7418061b1d569 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 17:54:29 compute-0 neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d[217243]: [NOTICE]   (217247) : New worker (217249) forked
Nov 28 17:54:29 compute-0 neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d[217243]: [NOTICE]   (217247) : Loading success.
Nov 28 17:54:29 compute-0 podman[197556]: time="2025-11-28T17:54:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:54:29 compute-0 podman[197556]: @ - - [28/Nov/2025:17:54:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18323 "" "Go-http-client/1.1"
Nov 28 17:54:29 compute-0 podman[197556]: @ - - [28/Nov/2025:17:54:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3056 "" "Go-http-client/1.1"
Nov 28 17:54:29 compute-0 nova_compute[187223]: 2025-11-28 17:54:29.937 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:54:30 compute-0 nova_compute[187223]: 2025-11-28 17:54:30.910 187227 DEBUG nova.compute.manager [req-d80e4668-66cc-4b12-ac4d-69118fa48dee req-ff200f1b-b3da-4ac0-bd18-5f81fff99bc4 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 483838f6-db84-40e5-a623-242e763d4feb] Received event network-vif-plugged-5d766eeb-6df1-461c-b40b-6552738e5458 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:54:30 compute-0 nova_compute[187223]: 2025-11-28 17:54:30.910 187227 DEBUG oslo_concurrency.lockutils [req-d80e4668-66cc-4b12-ac4d-69118fa48dee req-ff200f1b-b3da-4ac0-bd18-5f81fff99bc4 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "483838f6-db84-40e5-a623-242e763d4feb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:54:30 compute-0 nova_compute[187223]: 2025-11-28 17:54:30.910 187227 DEBUG oslo_concurrency.lockutils [req-d80e4668-66cc-4b12-ac4d-69118fa48dee req-ff200f1b-b3da-4ac0-bd18-5f81fff99bc4 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "483838f6-db84-40e5-a623-242e763d4feb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:54:30 compute-0 nova_compute[187223]: 2025-11-28 17:54:30.911 187227 DEBUG oslo_concurrency.lockutils [req-d80e4668-66cc-4b12-ac4d-69118fa48dee req-ff200f1b-b3da-4ac0-bd18-5f81fff99bc4 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "483838f6-db84-40e5-a623-242e763d4feb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:54:30 compute-0 nova_compute[187223]: 2025-11-28 17:54:30.911 187227 DEBUG nova.compute.manager [req-d80e4668-66cc-4b12-ac4d-69118fa48dee req-ff200f1b-b3da-4ac0-bd18-5f81fff99bc4 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 483838f6-db84-40e5-a623-242e763d4feb] No waiting events found dispatching network-vif-plugged-5d766eeb-6df1-461c-b40b-6552738e5458 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:54:30 compute-0 nova_compute[187223]: 2025-11-28 17:54:30.911 187227 WARNING nova.compute.manager [req-d80e4668-66cc-4b12-ac4d-69118fa48dee req-ff200f1b-b3da-4ac0-bd18-5f81fff99bc4 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 483838f6-db84-40e5-a623-242e763d4feb] Received unexpected event network-vif-plugged-5d766eeb-6df1-461c-b40b-6552738e5458 for instance with vm_state active and task_state None.
Nov 28 17:54:31 compute-0 openstack_network_exporter[199717]: ERROR   17:54:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:54:31 compute-0 openstack_network_exporter[199717]: ERROR   17:54:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:54:31 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:54:31 compute-0 openstack_network_exporter[199717]: ERROR   17:54:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:54:31 compute-0 openstack_network_exporter[199717]: ERROR   17:54:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:54:31 compute-0 openstack_network_exporter[199717]: ERROR   17:54:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:54:31 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:54:32 compute-0 nova_compute[187223]: 2025-11-28 17:54:32.980 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:54:34 compute-0 nova_compute[187223]: 2025-11-28 17:54:34.939 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:54:36 compute-0 podman[217258]: 2025-11-28 17:54:36.214639152 +0000 UTC m=+0.071128853 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 28 17:54:36 compute-0 podman[217259]: 2025-11-28 17:54:36.238702698 +0000 UTC m=+0.091299756 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible)
Nov 28 17:54:37 compute-0 nova_compute[187223]: 2025-11-28 17:54:37.982 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:54:39 compute-0 podman[217300]: 2025-11-28 17:54:39.195108673 +0000 UTC m=+0.054916375 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, distribution-scope=public, container_name=openstack_network_exporter, version=9.6, release=1755695350, io.buildah.version=1.33.7, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vendor=Red Hat, Inc., config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 28 17:54:39 compute-0 nova_compute[187223]: 2025-11-28 17:54:39.942 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:54:42 compute-0 ovn_controller[95574]: 2025-11-28T17:54:42Z|00025|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:1f:75:aa 10.100.0.8
Nov 28 17:54:42 compute-0 ovn_controller[95574]: 2025-11-28T17:54:42Z|00026|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:1f:75:aa 10.100.0.8
Nov 28 17:54:42 compute-0 nova_compute[187223]: 2025-11-28 17:54:42.984 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:54:43 compute-0 nova_compute[187223]: 2025-11-28 17:54:43.679 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:54:43 compute-0 nova_compute[187223]: 2025-11-28 17:54:43.699 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:54:43 compute-0 nova_compute[187223]: 2025-11-28 17:54:43.699 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:54:43 compute-0 nova_compute[187223]: 2025-11-28 17:54:43.699 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 17:54:44 compute-0 nova_compute[187223]: 2025-11-28 17:54:44.985 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:54:46 compute-0 nova_compute[187223]: 2025-11-28 17:54:46.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:54:46 compute-0 nova_compute[187223]: 2025-11-28 17:54:46.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:54:47 compute-0 nova_compute[187223]: 2025-11-28 17:54:47.986 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:54:49 compute-0 nova_compute[187223]: 2025-11-28 17:54:49.988 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:54:50 compute-0 nova_compute[187223]: 2025-11-28 17:54:50.685 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:54:50 compute-0 nova_compute[187223]: 2025-11-28 17:54:50.685 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 17:54:50 compute-0 nova_compute[187223]: 2025-11-28 17:54:50.685 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 17:54:51 compute-0 nova_compute[187223]: 2025-11-28 17:54:51.123 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "refresh_cache-483838f6-db84-40e5-a623-242e763d4feb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 17:54:51 compute-0 nova_compute[187223]: 2025-11-28 17:54:51.123 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquired lock "refresh_cache-483838f6-db84-40e5-a623-242e763d4feb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 17:54:51 compute-0 nova_compute[187223]: 2025-11-28 17:54:51.123 187227 DEBUG nova.network.neutron [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] [instance: 483838f6-db84-40e5-a623-242e763d4feb] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 28 17:54:51 compute-0 nova_compute[187223]: 2025-11-28 17:54:51.123 187227 DEBUG nova.objects.instance [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 483838f6-db84-40e5-a623-242e763d4feb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:54:52 compute-0 podman[217343]: 2025-11-28 17:54:52.214898693 +0000 UTC m=+0.078603719 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 17:54:52 compute-0 nova_compute[187223]: 2025-11-28 17:54:52.614 187227 DEBUG nova.network.neutron [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] [instance: 483838f6-db84-40e5-a623-242e763d4feb] Updating instance_info_cache with network_info: [{"id": "5d766eeb-6df1-461c-b40b-6552738e5458", "address": "fa:16:3e:1f:75:aa", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d766eeb-6d", "ovs_interfaceid": "5d766eeb-6df1-461c-b40b-6552738e5458", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:54:52 compute-0 nova_compute[187223]: 2025-11-28 17:54:52.634 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Releasing lock "refresh_cache-483838f6-db84-40e5-a623-242e763d4feb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 17:54:52 compute-0 nova_compute[187223]: 2025-11-28 17:54:52.634 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] [instance: 483838f6-db84-40e5-a623-242e763d4feb] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 28 17:54:52 compute-0 nova_compute[187223]: 2025-11-28 17:54:52.634 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:54:52 compute-0 nova_compute[187223]: 2025-11-28 17:54:52.654 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:54:52 compute-0 nova_compute[187223]: 2025-11-28 17:54:52.654 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:54:52 compute-0 nova_compute[187223]: 2025-11-28 17:54:52.654 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:54:52 compute-0 nova_compute[187223]: 2025-11-28 17:54:52.655 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 17:54:52 compute-0 nova_compute[187223]: 2025-11-28 17:54:52.716 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/483838f6-db84-40e5-a623-242e763d4feb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:54:52 compute-0 nova_compute[187223]: 2025-11-28 17:54:52.789 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/483838f6-db84-40e5-a623-242e763d4feb/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:54:52 compute-0 nova_compute[187223]: 2025-11-28 17:54:52.790 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/483838f6-db84-40e5-a623-242e763d4feb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:54:52 compute-0 nova_compute[187223]: 2025-11-28 17:54:52.861 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/483838f6-db84-40e5-a623-242e763d4feb/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:54:52 compute-0 nova_compute[187223]: 2025-11-28 17:54:52.988 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:54:53 compute-0 nova_compute[187223]: 2025-11-28 17:54:53.045 187227 WARNING nova.virt.libvirt.driver [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 17:54:53 compute-0 nova_compute[187223]: 2025-11-28 17:54:53.047 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5684MB free_disk=73.31277084350586GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 17:54:53 compute-0 nova_compute[187223]: 2025-11-28 17:54:53.048 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:54:53 compute-0 nova_compute[187223]: 2025-11-28 17:54:53.049 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:54:53 compute-0 nova_compute[187223]: 2025-11-28 17:54:53.156 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Instance 483838f6-db84-40e5-a623-242e763d4feb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 17:54:53 compute-0 nova_compute[187223]: 2025-11-28 17:54:53.157 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 17:54:53 compute-0 nova_compute[187223]: 2025-11-28 17:54:53.157 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 17:54:53 compute-0 nova_compute[187223]: 2025-11-28 17:54:53.195 187227 DEBUG nova.compute.provider_tree [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 17:54:53 compute-0 nova_compute[187223]: 2025-11-28 17:54:53.215 187227 DEBUG nova.scheduler.client.report [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 17:54:53 compute-0 nova_compute[187223]: 2025-11-28 17:54:53.245 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 17:54:53 compute-0 nova_compute[187223]: 2025-11-28 17:54:53.246 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.197s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:54:53 compute-0 nova_compute[187223]: 2025-11-28 17:54:53.295 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:54:54 compute-0 nova_compute[187223]: 2025-11-28 17:54:54.678 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:54:55 compute-0 nova_compute[187223]: 2025-11-28 17:54:55.028 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:54:55 compute-0 nova_compute[187223]: 2025-11-28 17:54:55.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:54:57 compute-0 nova_compute[187223]: 2025-11-28 17:54:57.992 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:54:58 compute-0 ovn_controller[95574]: 2025-11-28T17:54:58Z|00174|memory_trim|INFO|Detected inactivity (last active 30023 ms ago): trimming memory
Nov 28 17:54:59 compute-0 podman[217373]: 2025-11-28 17:54:59.193940076 +0000 UTC m=+0.058908938 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 28 17:54:59 compute-0 podman[197556]: time="2025-11-28T17:54:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:54:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:54:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18323 "" "Go-http-client/1.1"
Nov 28 17:54:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:54:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3063 "" "Go-http-client/1.1"
Nov 28 17:55:00 compute-0 nova_compute[187223]: 2025-11-28 17:55:00.031 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:55:01 compute-0 openstack_network_exporter[199717]: ERROR   17:55:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:55:01 compute-0 openstack_network_exporter[199717]: ERROR   17:55:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:55:01 compute-0 openstack_network_exporter[199717]: ERROR   17:55:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:55:01 compute-0 openstack_network_exporter[199717]: ERROR   17:55:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:55:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:55:01 compute-0 openstack_network_exporter[199717]: ERROR   17:55:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:55:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:55:03 compute-0 nova_compute[187223]: 2025-11-28 17:55:03.024 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:55:05 compute-0 nova_compute[187223]: 2025-11-28 17:55:05.033 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:55:07 compute-0 podman[217392]: 2025-11-28 17:55:07.277764435 +0000 UTC m=+0.138616908 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 17:55:07 compute-0 podman[217393]: 2025-11-28 17:55:07.278348161 +0000 UTC m=+0.136275150 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 17:55:08 compute-0 nova_compute[187223]: 2025-11-28 17:55:08.026 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:55:10 compute-0 nova_compute[187223]: 2025-11-28 17:55:10.069 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:55:10 compute-0 podman[217437]: 2025-11-28 17:55:10.228052254 +0000 UTC m=+0.090214300 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, distribution-scope=public, managed_by=edpm_ansible, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 28 17:55:12 compute-0 nova_compute[187223]: 2025-11-28 17:55:12.192 187227 DEBUG nova.virt.libvirt.driver [None req-16605185-ee5c-4a26-b86a-65aea54cece2 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 47555dcf-fdd9-4b13-9d89-6c1d5bad67f2] Creating tmpfile /var/lib/nova/instances/tmpxjwh7i6d to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Nov 28 17:55:12 compute-0 nova_compute[187223]: 2025-11-28 17:55:12.193 187227 DEBUG nova.compute.manager [None req-16605185-ee5c-4a26-b86a-65aea54cece2 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpxjwh7i6d',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Nov 28 17:55:13 compute-0 nova_compute[187223]: 2025-11-28 17:55:13.031 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:55:13 compute-0 nova_compute[187223]: 2025-11-28 17:55:13.793 187227 DEBUG nova.compute.manager [None req-16605185-ee5c-4a26-b86a-65aea54cece2 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpxjwh7i6d',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='47555dcf-fdd9-4b13-9d89-6c1d5bad67f2',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Nov 28 17:55:13 compute-0 nova_compute[187223]: 2025-11-28 17:55:13.825 187227 DEBUG oslo_concurrency.lockutils [None req-16605185-ee5c-4a26-b86a-65aea54cece2 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "refresh_cache-47555dcf-fdd9-4b13-9d89-6c1d5bad67f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 17:55:13 compute-0 nova_compute[187223]: 2025-11-28 17:55:13.826 187227 DEBUG oslo_concurrency.lockutils [None req-16605185-ee5c-4a26-b86a-65aea54cece2 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquired lock "refresh_cache-47555dcf-fdd9-4b13-9d89-6c1d5bad67f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 17:55:13 compute-0 nova_compute[187223]: 2025-11-28 17:55:13.826 187227 DEBUG nova.network.neutron [None req-16605185-ee5c-4a26-b86a-65aea54cece2 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 47555dcf-fdd9-4b13-9d89-6c1d5bad67f2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 28 17:55:15 compute-0 nova_compute[187223]: 2025-11-28 17:55:15.071 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:55:15 compute-0 nova_compute[187223]: 2025-11-28 17:55:15.159 187227 DEBUG nova.network.neutron [None req-16605185-ee5c-4a26-b86a-65aea54cece2 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 47555dcf-fdd9-4b13-9d89-6c1d5bad67f2] Updating instance_info_cache with network_info: [{"id": "c078831e-eb1e-4052-871e-e28c19f0f56f", "address": "fa:16:3e:f6:a0:5d", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc078831e-eb", "ovs_interfaceid": "c078831e-eb1e-4052-871e-e28c19f0f56f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:55:15 compute-0 nova_compute[187223]: 2025-11-28 17:55:15.175 187227 DEBUG oslo_concurrency.lockutils [None req-16605185-ee5c-4a26-b86a-65aea54cece2 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Releasing lock "refresh_cache-47555dcf-fdd9-4b13-9d89-6c1d5bad67f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 17:55:15 compute-0 nova_compute[187223]: 2025-11-28 17:55:15.178 187227 DEBUG nova.virt.libvirt.driver [None req-16605185-ee5c-4a26-b86a-65aea54cece2 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 47555dcf-fdd9-4b13-9d89-6c1d5bad67f2] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpxjwh7i6d',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='47555dcf-fdd9-4b13-9d89-6c1d5bad67f2',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Nov 28 17:55:15 compute-0 nova_compute[187223]: 2025-11-28 17:55:15.179 187227 DEBUG nova.virt.libvirt.driver [None req-16605185-ee5c-4a26-b86a-65aea54cece2 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 47555dcf-fdd9-4b13-9d89-6c1d5bad67f2] Creating instance directory: /var/lib/nova/instances/47555dcf-fdd9-4b13-9d89-6c1d5bad67f2 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Nov 28 17:55:15 compute-0 nova_compute[187223]: 2025-11-28 17:55:15.179 187227 DEBUG nova.virt.libvirt.driver [None req-16605185-ee5c-4a26-b86a-65aea54cece2 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 47555dcf-fdd9-4b13-9d89-6c1d5bad67f2] Creating disk.info with the contents: {'/var/lib/nova/instances/47555dcf-fdd9-4b13-9d89-6c1d5bad67f2/disk': 'qcow2', '/var/lib/nova/instances/47555dcf-fdd9-4b13-9d89-6c1d5bad67f2/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Nov 28 17:55:15 compute-0 nova_compute[187223]: 2025-11-28 17:55:15.180 187227 DEBUG nova.virt.libvirt.driver [None req-16605185-ee5c-4a26-b86a-65aea54cece2 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 47555dcf-fdd9-4b13-9d89-6c1d5bad67f2] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Nov 28 17:55:15 compute-0 nova_compute[187223]: 2025-11-28 17:55:15.182 187227 DEBUG nova.objects.instance [None req-16605185-ee5c-4a26-b86a-65aea54cece2 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 47555dcf-fdd9-4b13-9d89-6c1d5bad67f2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:55:15 compute-0 nova_compute[187223]: 2025-11-28 17:55:15.209 187227 DEBUG oslo_concurrency.processutils [None req-16605185-ee5c-4a26-b86a-65aea54cece2 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:55:15 compute-0 nova_compute[187223]: 2025-11-28 17:55:15.266 187227 DEBUG oslo_concurrency.processutils [None req-16605185-ee5c-4a26-b86a-65aea54cece2 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:55:15 compute-0 nova_compute[187223]: 2025-11-28 17:55:15.267 187227 DEBUG oslo_concurrency.lockutils [None req-16605185-ee5c-4a26-b86a-65aea54cece2 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "bd6565e0cc32420aac21dd3de8842bc53bb13eba" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:55:15 compute-0 nova_compute[187223]: 2025-11-28 17:55:15.268 187227 DEBUG oslo_concurrency.lockutils [None req-16605185-ee5c-4a26-b86a-65aea54cece2 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "bd6565e0cc32420aac21dd3de8842bc53bb13eba" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:55:15 compute-0 nova_compute[187223]: 2025-11-28 17:55:15.283 187227 DEBUG oslo_concurrency.processutils [None req-16605185-ee5c-4a26-b86a-65aea54cece2 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:55:15 compute-0 nova_compute[187223]: 2025-11-28 17:55:15.340 187227 DEBUG oslo_concurrency.processutils [None req-16605185-ee5c-4a26-b86a-65aea54cece2 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:55:15 compute-0 nova_compute[187223]: 2025-11-28 17:55:15.342 187227 DEBUG oslo_concurrency.processutils [None req-16605185-ee5c-4a26-b86a-65aea54cece2 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba,backing_fmt=raw /var/lib/nova/instances/47555dcf-fdd9-4b13-9d89-6c1d5bad67f2/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:55:15 compute-0 nova_compute[187223]: 2025-11-28 17:55:15.378 187227 DEBUG oslo_concurrency.processutils [None req-16605185-ee5c-4a26-b86a-65aea54cece2 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba,backing_fmt=raw /var/lib/nova/instances/47555dcf-fdd9-4b13-9d89-6c1d5bad67f2/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:55:15 compute-0 nova_compute[187223]: 2025-11-28 17:55:15.379 187227 DEBUG oslo_concurrency.lockutils [None req-16605185-ee5c-4a26-b86a-65aea54cece2 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "bd6565e0cc32420aac21dd3de8842bc53bb13eba" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:55:15 compute-0 nova_compute[187223]: 2025-11-28 17:55:15.380 187227 DEBUG oslo_concurrency.processutils [None req-16605185-ee5c-4a26-b86a-65aea54cece2 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:55:15 compute-0 nova_compute[187223]: 2025-11-28 17:55:15.436 187227 DEBUG oslo_concurrency.processutils [None req-16605185-ee5c-4a26-b86a-65aea54cece2 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:55:15 compute-0 nova_compute[187223]: 2025-11-28 17:55:15.438 187227 DEBUG nova.virt.disk.api [None req-16605185-ee5c-4a26-b86a-65aea54cece2 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Checking if we can resize image /var/lib/nova/instances/47555dcf-fdd9-4b13-9d89-6c1d5bad67f2/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 28 17:55:15 compute-0 nova_compute[187223]: 2025-11-28 17:55:15.439 187227 DEBUG oslo_concurrency.processutils [None req-16605185-ee5c-4a26-b86a-65aea54cece2 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/47555dcf-fdd9-4b13-9d89-6c1d5bad67f2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:55:15 compute-0 nova_compute[187223]: 2025-11-28 17:55:15.504 187227 DEBUG oslo_concurrency.processutils [None req-16605185-ee5c-4a26-b86a-65aea54cece2 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/47555dcf-fdd9-4b13-9d89-6c1d5bad67f2/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:55:15 compute-0 nova_compute[187223]: 2025-11-28 17:55:15.506 187227 DEBUG nova.virt.disk.api [None req-16605185-ee5c-4a26-b86a-65aea54cece2 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Cannot resize image /var/lib/nova/instances/47555dcf-fdd9-4b13-9d89-6c1d5bad67f2/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 28 17:55:15 compute-0 nova_compute[187223]: 2025-11-28 17:55:15.506 187227 DEBUG nova.objects.instance [None req-16605185-ee5c-4a26-b86a-65aea54cece2 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lazy-loading 'migration_context' on Instance uuid 47555dcf-fdd9-4b13-9d89-6c1d5bad67f2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:55:15 compute-0 nova_compute[187223]: 2025-11-28 17:55:15.537 187227 DEBUG oslo_concurrency.processutils [None req-16605185-ee5c-4a26-b86a-65aea54cece2 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/47555dcf-fdd9-4b13-9d89-6c1d5bad67f2/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:55:15 compute-0 nova_compute[187223]: 2025-11-28 17:55:15.564 187227 DEBUG oslo_concurrency.processutils [None req-16605185-ee5c-4a26-b86a-65aea54cece2 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/47555dcf-fdd9-4b13-9d89-6c1d5bad67f2/disk.config 485376" returned: 0 in 0.027s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:55:15 compute-0 nova_compute[187223]: 2025-11-28 17:55:15.567 187227 DEBUG nova.virt.libvirt.volume.remotefs [None req-16605185-ee5c-4a26-b86a-65aea54cece2 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Copying file compute-1.ctlplane.example.com:/var/lib/nova/instances/47555dcf-fdd9-4b13-9d89-6c1d5bad67f2/disk.config to /var/lib/nova/instances/47555dcf-fdd9-4b13-9d89-6c1d5bad67f2 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Nov 28 17:55:15 compute-0 nova_compute[187223]: 2025-11-28 17:55:15.568 187227 DEBUG oslo_concurrency.processutils [None req-16605185-ee5c-4a26-b86a-65aea54cece2 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Running cmd (subprocess): scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/47555dcf-fdd9-4b13-9d89-6c1d5bad67f2/disk.config /var/lib/nova/instances/47555dcf-fdd9-4b13-9d89-6c1d5bad67f2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:55:16 compute-0 nova_compute[187223]: 2025-11-28 17:55:16.088 187227 DEBUG oslo_concurrency.processutils [None req-16605185-ee5c-4a26-b86a-65aea54cece2 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] CMD "scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/47555dcf-fdd9-4b13-9d89-6c1d5bad67f2/disk.config /var/lib/nova/instances/47555dcf-fdd9-4b13-9d89-6c1d5bad67f2" returned: 0 in 0.520s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:55:16 compute-0 nova_compute[187223]: 2025-11-28 17:55:16.089 187227 DEBUG nova.virt.libvirt.driver [None req-16605185-ee5c-4a26-b86a-65aea54cece2 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 47555dcf-fdd9-4b13-9d89-6c1d5bad67f2] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Nov 28 17:55:16 compute-0 nova_compute[187223]: 2025-11-28 17:55:16.090 187227 DEBUG nova.virt.libvirt.vif [None req-16605185-ee5c-4a26-b86a-65aea54cece2 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T17:54:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1673754372',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1673754372',id=23,image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-28T17:54:12Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='f987f40adf1f46018ab0ca81b8d954f6',ramdisk_id='',reservation_id='r-cpxxvpdl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',
owner_project_name='tempest-TestExecuteStrategies-384316604',owner_user_name='tempest-TestExecuteStrategies-384316604-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T17:54:12Z,user_data=None,user_id='40bca16232f3471c8094a414f8874e9a',uuid=47555dcf-fdd9-4b13-9d89-6c1d5bad67f2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c078831e-eb1e-4052-871e-e28c19f0f56f", "address": "fa:16:3e:f6:a0:5d", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapc078831e-eb", "ovs_interfaceid": "c078831e-eb1e-4052-871e-e28c19f0f56f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 28 17:55:16 compute-0 nova_compute[187223]: 2025-11-28 17:55:16.090 187227 DEBUG nova.network.os_vif_util [None req-16605185-ee5c-4a26-b86a-65aea54cece2 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Converting VIF {"id": "c078831e-eb1e-4052-871e-e28c19f0f56f", "address": "fa:16:3e:f6:a0:5d", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapc078831e-eb", "ovs_interfaceid": "c078831e-eb1e-4052-871e-e28c19f0f56f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 17:55:16 compute-0 nova_compute[187223]: 2025-11-28 17:55:16.091 187227 DEBUG nova.network.os_vif_util [None req-16605185-ee5c-4a26-b86a-65aea54cece2 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f6:a0:5d,bridge_name='br-int',has_traffic_filtering=True,id=c078831e-eb1e-4052-871e-e28c19f0f56f,network=Network(7710a7d0-31b3-4473-89c4-40533fdd6e7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc078831e-eb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 17:55:16 compute-0 nova_compute[187223]: 2025-11-28 17:55:16.091 187227 DEBUG os_vif [None req-16605185-ee5c-4a26-b86a-65aea54cece2 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f6:a0:5d,bridge_name='br-int',has_traffic_filtering=True,id=c078831e-eb1e-4052-871e-e28c19f0f56f,network=Network(7710a7d0-31b3-4473-89c4-40533fdd6e7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc078831e-eb') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 28 17:55:16 compute-0 nova_compute[187223]: 2025-11-28 17:55:16.092 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:55:16 compute-0 nova_compute[187223]: 2025-11-28 17:55:16.092 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:55:16 compute-0 nova_compute[187223]: 2025-11-28 17:55:16.093 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 17:55:16 compute-0 nova_compute[187223]: 2025-11-28 17:55:16.095 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:55:16 compute-0 nova_compute[187223]: 2025-11-28 17:55:16.095 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc078831e-eb, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:55:16 compute-0 nova_compute[187223]: 2025-11-28 17:55:16.095 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc078831e-eb, col_values=(('external_ids', {'iface-id': 'c078831e-eb1e-4052-871e-e28c19f0f56f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f6:a0:5d', 'vm-uuid': '47555dcf-fdd9-4b13-9d89-6c1d5bad67f2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:55:16 compute-0 nova_compute[187223]: 2025-11-28 17:55:16.097 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:55:16 compute-0 NetworkManager[55763]: <info>  [1764352516.0991] manager: (tapc078831e-eb): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/75)
Nov 28 17:55:16 compute-0 nova_compute[187223]: 2025-11-28 17:55:16.099 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 17:55:16 compute-0 nova_compute[187223]: 2025-11-28 17:55:16.106 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:55:16 compute-0 nova_compute[187223]: 2025-11-28 17:55:16.107 187227 INFO os_vif [None req-16605185-ee5c-4a26-b86a-65aea54cece2 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f6:a0:5d,bridge_name='br-int',has_traffic_filtering=True,id=c078831e-eb1e-4052-871e-e28c19f0f56f,network=Network(7710a7d0-31b3-4473-89c4-40533fdd6e7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc078831e-eb')
Nov 28 17:55:16 compute-0 nova_compute[187223]: 2025-11-28 17:55:16.108 187227 DEBUG nova.virt.libvirt.driver [None req-16605185-ee5c-4a26-b86a-65aea54cece2 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Nov 28 17:55:16 compute-0 nova_compute[187223]: 2025-11-28 17:55:16.108 187227 DEBUG nova.compute.manager [None req-16605185-ee5c-4a26-b86a-65aea54cece2 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpxjwh7i6d',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='47555dcf-fdd9-4b13-9d89-6c1d5bad67f2',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Nov 28 17:55:16 compute-0 nova_compute[187223]: 2025-11-28 17:55:16.631 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:55:16 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:55:16.631 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=25, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3e:3c:af', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '0e:e0:60:f9:5f:25'}, ipsec=False) old=SB_Global(nb_cfg=24) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 17:55:16 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:55:16.633 104433 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 17:55:16 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:55:16.636 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2ad2dbac-a967-40fb-b69b-7c374c5f8e9d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '25'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:55:16 compute-0 nova_compute[187223]: 2025-11-28 17:55:16.943 187227 DEBUG nova.network.neutron [None req-16605185-ee5c-4a26-b86a-65aea54cece2 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 47555dcf-fdd9-4b13-9d89-6c1d5bad67f2] Port c078831e-eb1e-4052-871e-e28c19f0f56f updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Nov 28 17:55:16 compute-0 nova_compute[187223]: 2025-11-28 17:55:16.945 187227 DEBUG nova.compute.manager [None req-16605185-ee5c-4a26-b86a-65aea54cece2 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpxjwh7i6d',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='47555dcf-fdd9-4b13-9d89-6c1d5bad67f2',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Nov 28 17:55:17 compute-0 systemd[1]: Starting libvirt proxy daemon...
Nov 28 17:55:17 compute-0 systemd[1]: Started libvirt proxy daemon.
Nov 28 17:55:17 compute-0 kernel: tapc078831e-eb: entered promiscuous mode
Nov 28 17:55:17 compute-0 ovn_controller[95574]: 2025-11-28T17:55:17Z|00175|binding|INFO|Claiming lport c078831e-eb1e-4052-871e-e28c19f0f56f for this additional chassis.
Nov 28 17:55:17 compute-0 ovn_controller[95574]: 2025-11-28T17:55:17Z|00176|binding|INFO|c078831e-eb1e-4052-871e-e28c19f0f56f: Claiming fa:16:3e:f6:a0:5d 10.100.0.4
Nov 28 17:55:17 compute-0 nova_compute[187223]: 2025-11-28 17:55:17.246 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:55:17 compute-0 NetworkManager[55763]: <info>  [1764352517.2476] manager: (tapc078831e-eb): new Tun device (/org/freedesktop/NetworkManager/Devices/76)
Nov 28 17:55:17 compute-0 ovn_controller[95574]: 2025-11-28T17:55:17Z|00177|binding|INFO|Setting lport c078831e-eb1e-4052-871e-e28c19f0f56f ovn-installed in OVS
Nov 28 17:55:17 compute-0 nova_compute[187223]: 2025-11-28 17:55:17.261 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:55:17 compute-0 nova_compute[187223]: 2025-11-28 17:55:17.263 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:55:17 compute-0 nova_compute[187223]: 2025-11-28 17:55:17.268 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:55:17 compute-0 systemd-udevd[217514]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 17:55:17 compute-0 systemd-machined[153517]: New machine qemu-17-instance-00000017.
Nov 28 17:55:17 compute-0 NetworkManager[55763]: <info>  [1764352517.2950] device (tapc078831e-eb): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 28 17:55:17 compute-0 NetworkManager[55763]: <info>  [1764352517.2961] device (tapc078831e-eb): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 28 17:55:17 compute-0 systemd[1]: Started Virtual Machine qemu-17-instance-00000017.
Nov 28 17:55:18 compute-0 nova_compute[187223]: 2025-11-28 17:55:18.561 187227 DEBUG nova.virt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Emitting event <LifecycleEvent: 1764352518.5611773, 47555dcf-fdd9-4b13-9d89-6c1d5bad67f2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:55:18 compute-0 nova_compute[187223]: 2025-11-28 17:55:18.561 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 47555dcf-fdd9-4b13-9d89-6c1d5bad67f2] VM Started (Lifecycle Event)
Nov 28 17:55:18 compute-0 nova_compute[187223]: 2025-11-28 17:55:18.594 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 47555dcf-fdd9-4b13-9d89-6c1d5bad67f2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:55:19 compute-0 nova_compute[187223]: 2025-11-28 17:55:19.324 187227 DEBUG nova.virt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Emitting event <LifecycleEvent: 1764352519.3243933, 47555dcf-fdd9-4b13-9d89-6c1d5bad67f2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:55:19 compute-0 nova_compute[187223]: 2025-11-28 17:55:19.325 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 47555dcf-fdd9-4b13-9d89-6c1d5bad67f2] VM Resumed (Lifecycle Event)
Nov 28 17:55:19 compute-0 nova_compute[187223]: 2025-11-28 17:55:19.351 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 47555dcf-fdd9-4b13-9d89-6c1d5bad67f2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:55:19 compute-0 nova_compute[187223]: 2025-11-28 17:55:19.355 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 47555dcf-fdd9-4b13-9d89-6c1d5bad67f2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 28 17:55:19 compute-0 nova_compute[187223]: 2025-11-28 17:55:19.382 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 47555dcf-fdd9-4b13-9d89-6c1d5bad67f2] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-0.ctlplane.example.com
Nov 28 17:55:20 compute-0 nova_compute[187223]: 2025-11-28 17:55:20.076 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:55:20 compute-0 ovn_controller[95574]: 2025-11-28T17:55:20Z|00178|binding|INFO|Claiming lport c078831e-eb1e-4052-871e-e28c19f0f56f for this chassis.
Nov 28 17:55:20 compute-0 ovn_controller[95574]: 2025-11-28T17:55:20Z|00179|binding|INFO|c078831e-eb1e-4052-871e-e28c19f0f56f: Claiming fa:16:3e:f6:a0:5d 10.100.0.4
Nov 28 17:55:20 compute-0 ovn_controller[95574]: 2025-11-28T17:55:20Z|00180|binding|INFO|Setting lport c078831e-eb1e-4052-871e-e28c19f0f56f up in Southbound
Nov 28 17:55:20 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:55:20.418 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f6:a0:5d 10.100.0.4'], port_security=['fa:16:3e:f6:a0:5d 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '47555dcf-fdd9-4b13-9d89-6c1d5bad67f2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f987f40adf1f46018ab0ca81b8d954f6', 'neutron:revision_number': '11', 'neutron:security_group_ids': '2454196c-3f33-40c4-b65e-ff5ce51cee25', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8b91a17c-2725-4cda-baa8-a6987e8e4bba, chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], logical_port=c078831e-eb1e-4052-871e-e28c19f0f56f) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 17:55:20 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:55:20.419 104433 INFO neutron.agent.ovn.metadata.agent [-] Port c078831e-eb1e-4052-871e-e28c19f0f56f in datapath 7710a7d0-31b3-4473-89c4-40533fdd6e7d bound to our chassis
Nov 28 17:55:20 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:55:20.422 104433 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7710a7d0-31b3-4473-89c4-40533fdd6e7d
Nov 28 17:55:20 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:55:20.441 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[d106be8e-4df1-4399-a88f-3dc03b0c7acf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:55:20 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:55:20.476 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[4c52c895-3728-47b5-836d-72f6a859a472]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:55:20 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:55:20.481 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[c3a4c83c-917d-4adf-8dc4-4a8e379961ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:55:20 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:55:20.512 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[6050ac9c-76ba-4839-8d32-a6a05fd10e17]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:55:20 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:55:20.528 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[f511937f-3e8a-4e35-bd11-b154fa57f251]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7710a7d0-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7e:b9:9f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 51], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 564337, 'reachable_time': 34349, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217546, 'error': None, 'target': 'ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:55:20 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:55:20.544 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[1a022d41-06ae-401b-87f1-9c09c3cdb72a]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap7710a7d0-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 564348, 'tstamp': 564348}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217547, 'error': None, 'target': 'ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap7710a7d0-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 564350, 'tstamp': 564350}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217547, 'error': None, 'target': 'ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:55:20 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:55:20.546 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7710a7d0-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:55:20 compute-0 nova_compute[187223]: 2025-11-28 17:55:20.548 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:55:20 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:55:20.550 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7710a7d0-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:55:20 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:55:20.551 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 17:55:20 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:55:20.551 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7710a7d0-30, col_values=(('external_ids', {'iface-id': 'bc789832-2d4b-4b14-95c2-e30a740a3a6b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:55:20 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:55:20.552 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 17:55:20 compute-0 nova_compute[187223]: 2025-11-28 17:55:20.608 187227 INFO nova.compute.manager [None req-16605185-ee5c-4a26-b86a-65aea54cece2 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 47555dcf-fdd9-4b13-9d89-6c1d5bad67f2] Post operation of migration started
Nov 28 17:55:20 compute-0 nova_compute[187223]: 2025-11-28 17:55:20.818 187227 DEBUG oslo_concurrency.lockutils [None req-16605185-ee5c-4a26-b86a-65aea54cece2 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "refresh_cache-47555dcf-fdd9-4b13-9d89-6c1d5bad67f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 17:55:20 compute-0 nova_compute[187223]: 2025-11-28 17:55:20.819 187227 DEBUG oslo_concurrency.lockutils [None req-16605185-ee5c-4a26-b86a-65aea54cece2 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquired lock "refresh_cache-47555dcf-fdd9-4b13-9d89-6c1d5bad67f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 17:55:20 compute-0 nova_compute[187223]: 2025-11-28 17:55:20.820 187227 DEBUG nova.network.neutron [None req-16605185-ee5c-4a26-b86a-65aea54cece2 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 47555dcf-fdd9-4b13-9d89-6c1d5bad67f2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 28 17:55:21 compute-0 nova_compute[187223]: 2025-11-28 17:55:21.152 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:55:21 compute-0 nova_compute[187223]: 2025-11-28 17:55:21.966 187227 DEBUG nova.network.neutron [None req-16605185-ee5c-4a26-b86a-65aea54cece2 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 47555dcf-fdd9-4b13-9d89-6c1d5bad67f2] Updating instance_info_cache with network_info: [{"id": "c078831e-eb1e-4052-871e-e28c19f0f56f", "address": "fa:16:3e:f6:a0:5d", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc078831e-eb", "ovs_interfaceid": "c078831e-eb1e-4052-871e-e28c19f0f56f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:55:21 compute-0 nova_compute[187223]: 2025-11-28 17:55:21.989 187227 DEBUG oslo_concurrency.lockutils [None req-16605185-ee5c-4a26-b86a-65aea54cece2 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Releasing lock "refresh_cache-47555dcf-fdd9-4b13-9d89-6c1d5bad67f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 17:55:22 compute-0 nova_compute[187223]: 2025-11-28 17:55:22.034 187227 DEBUG oslo_concurrency.lockutils [None req-16605185-ee5c-4a26-b86a-65aea54cece2 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:55:22 compute-0 nova_compute[187223]: 2025-11-28 17:55:22.035 187227 DEBUG oslo_concurrency.lockutils [None req-16605185-ee5c-4a26-b86a-65aea54cece2 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:55:22 compute-0 nova_compute[187223]: 2025-11-28 17:55:22.035 187227 DEBUG oslo_concurrency.lockutils [None req-16605185-ee5c-4a26-b86a-65aea54cece2 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:55:22 compute-0 nova_compute[187223]: 2025-11-28 17:55:22.043 187227 INFO nova.virt.libvirt.driver [None req-16605185-ee5c-4a26-b86a-65aea54cece2 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 47555dcf-fdd9-4b13-9d89-6c1d5bad67f2] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Nov 28 17:55:22 compute-0 virtqemud[186845]: Domain id=17 name='instance-00000017' uuid=47555dcf-fdd9-4b13-9d89-6c1d5bad67f2 is tainted: custom-monitor
Nov 28 17:55:23 compute-0 nova_compute[187223]: 2025-11-28 17:55:23.054 187227 INFO nova.virt.libvirt.driver [None req-16605185-ee5c-4a26-b86a-65aea54cece2 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 47555dcf-fdd9-4b13-9d89-6c1d5bad67f2] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Nov 28 17:55:23 compute-0 podman[217548]: 2025-11-28 17:55:23.252974 +0000 UTC m=+0.098328530 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 17:55:24 compute-0 nova_compute[187223]: 2025-11-28 17:55:24.062 187227 INFO nova.virt.libvirt.driver [None req-16605185-ee5c-4a26-b86a-65aea54cece2 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 47555dcf-fdd9-4b13-9d89-6c1d5bad67f2] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Nov 28 17:55:24 compute-0 nova_compute[187223]: 2025-11-28 17:55:24.069 187227 DEBUG nova.compute.manager [None req-16605185-ee5c-4a26-b86a-65aea54cece2 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 47555dcf-fdd9-4b13-9d89-6c1d5bad67f2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:55:24 compute-0 nova_compute[187223]: 2025-11-28 17:55:24.093 187227 DEBUG nova.objects.instance [None req-16605185-ee5c-4a26-b86a-65aea54cece2 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 47555dcf-fdd9-4b13-9d89-6c1d5bad67f2] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Nov 28 17:55:25 compute-0 nova_compute[187223]: 2025-11-28 17:55:25.078 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:55:26 compute-0 nova_compute[187223]: 2025-11-28 17:55:26.155 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:55:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:55:27.707 104433 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:55:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:55:27.708 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:55:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:55:27.708 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:55:28 compute-0 nova_compute[187223]: 2025-11-28 17:55:28.477 187227 DEBUG oslo_concurrency.lockutils [None req-4f66245e-53e0-42d5-85ae-62a56ca9d485 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Acquiring lock "483838f6-db84-40e5-a623-242e763d4feb" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:55:28 compute-0 nova_compute[187223]: 2025-11-28 17:55:28.478 187227 DEBUG oslo_concurrency.lockutils [None req-4f66245e-53e0-42d5-85ae-62a56ca9d485 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "483838f6-db84-40e5-a623-242e763d4feb" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:55:28 compute-0 nova_compute[187223]: 2025-11-28 17:55:28.478 187227 DEBUG oslo_concurrency.lockutils [None req-4f66245e-53e0-42d5-85ae-62a56ca9d485 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Acquiring lock "483838f6-db84-40e5-a623-242e763d4feb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:55:28 compute-0 nova_compute[187223]: 2025-11-28 17:55:28.478 187227 DEBUG oslo_concurrency.lockutils [None req-4f66245e-53e0-42d5-85ae-62a56ca9d485 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "483838f6-db84-40e5-a623-242e763d4feb-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:55:28 compute-0 nova_compute[187223]: 2025-11-28 17:55:28.478 187227 DEBUG oslo_concurrency.lockutils [None req-4f66245e-53e0-42d5-85ae-62a56ca9d485 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "483838f6-db84-40e5-a623-242e763d4feb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:55:28 compute-0 nova_compute[187223]: 2025-11-28 17:55:28.479 187227 INFO nova.compute.manager [None req-4f66245e-53e0-42d5-85ae-62a56ca9d485 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 483838f6-db84-40e5-a623-242e763d4feb] Terminating instance
Nov 28 17:55:28 compute-0 nova_compute[187223]: 2025-11-28 17:55:28.480 187227 DEBUG nova.compute.manager [None req-4f66245e-53e0-42d5-85ae-62a56ca9d485 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 483838f6-db84-40e5-a623-242e763d4feb] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 28 17:55:28 compute-0 kernel: tap5d766eeb-6d (unregistering): left promiscuous mode
Nov 28 17:55:28 compute-0 NetworkManager[55763]: <info>  [1764352528.5068] device (tap5d766eeb-6d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 28 17:55:28 compute-0 ovn_controller[95574]: 2025-11-28T17:55:28Z|00181|binding|INFO|Releasing lport 5d766eeb-6df1-461c-b40b-6552738e5458 from this chassis (sb_readonly=0)
Nov 28 17:55:28 compute-0 ovn_controller[95574]: 2025-11-28T17:55:28Z|00182|binding|INFO|Setting lport 5d766eeb-6df1-461c-b40b-6552738e5458 down in Southbound
Nov 28 17:55:28 compute-0 ovn_controller[95574]: 2025-11-28T17:55:28Z|00183|binding|INFO|Removing iface tap5d766eeb-6d ovn-installed in OVS
Nov 28 17:55:28 compute-0 nova_compute[187223]: 2025-11-28 17:55:28.517 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:55:28 compute-0 nova_compute[187223]: 2025-11-28 17:55:28.519 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:55:28 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:55:28.528 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1f:75:aa 10.100.0.8'], port_security=['fa:16:3e:1f:75:aa 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '483838f6-db84-40e5-a623-242e763d4feb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f987f40adf1f46018ab0ca81b8d954f6', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2454196c-3f33-40c4-b65e-ff5ce51cee25', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8b91a17c-2725-4cda-baa8-a6987e8e4bba, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], logical_port=5d766eeb-6df1-461c-b40b-6552738e5458) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 17:55:28 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:55:28.529 104433 INFO neutron.agent.ovn.metadata.agent [-] Port 5d766eeb-6df1-461c-b40b-6552738e5458 in datapath 7710a7d0-31b3-4473-89c4-40533fdd6e7d unbound from our chassis
Nov 28 17:55:28 compute-0 nova_compute[187223]: 2025-11-28 17:55:28.530 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:55:28 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:55:28.531 104433 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7710a7d0-31b3-4473-89c4-40533fdd6e7d
Nov 28 17:55:28 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:55:28.545 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[7adc8ef4-93c7-4440-910b-fbb0bf333e24]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:55:28 compute-0 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000018.scope: Deactivated successfully.
Nov 28 17:55:28 compute-0 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000018.scope: Consumed 14.320s CPU time.
Nov 28 17:55:28 compute-0 systemd-machined[153517]: Machine qemu-16-instance-00000018 terminated.
Nov 28 17:55:28 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:55:28.576 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[78fef166-f350-49f7-947e-bd826e5372a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:55:28 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:55:28.579 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[b6b45b3c-6558-4095-92a7-f151f25922bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:55:28 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:55:28.618 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[70a63baa-91c2-48a3-b411-83a15066b0af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:55:28 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:55:28.640 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[4c3a363d-0a6d-46f2-9e6c-d30d29e55a6c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7710a7d0-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7e:b9:9f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 51], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 564337, 'reachable_time': 34349, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217585, 'error': None, 'target': 'ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:55:28 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:55:28.659 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[252d00db-a212-4003-b6ce-a6456d92e324]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap7710a7d0-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 564348, 'tstamp': 564348}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217586, 'error': None, 'target': 'ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap7710a7d0-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 564350, 'tstamp': 564350}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217586, 'error': None, 'target': 'ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:55:28 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:55:28.661 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7710a7d0-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:55:28 compute-0 nova_compute[187223]: 2025-11-28 17:55:28.663 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:55:28 compute-0 nova_compute[187223]: 2025-11-28 17:55:28.670 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:55:28 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:55:28.670 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7710a7d0-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:55:28 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:55:28.670 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 17:55:28 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:55:28.671 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7710a7d0-30, col_values=(('external_ids', {'iface-id': 'bc789832-2d4b-4b14-95c2-e30a740a3a6b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:55:28 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:55:28.671 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 17:55:28 compute-0 nova_compute[187223]: 2025-11-28 17:55:28.771 187227 INFO nova.virt.libvirt.driver [-] [instance: 483838f6-db84-40e5-a623-242e763d4feb] Instance destroyed successfully.
Nov 28 17:55:28 compute-0 nova_compute[187223]: 2025-11-28 17:55:28.772 187227 DEBUG nova.objects.instance [None req-4f66245e-53e0-42d5-85ae-62a56ca9d485 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lazy-loading 'resources' on Instance uuid 483838f6-db84-40e5-a623-242e763d4feb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:55:28 compute-0 nova_compute[187223]: 2025-11-28 17:55:28.794 187227 DEBUG nova.virt.libvirt.vif [None req-4f66245e-53e0-42d5-85ae-62a56ca9d485 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T17:54:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1423921278',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1423921278',id=24,image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-28T17:54:29Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f987f40adf1f46018ab0ca81b8d954f6',ramdisk_id='',reservation_id='r-otqdraez',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_nam
e='tempest-TestExecuteStrategies-384316604',owner_user_name='tempest-TestExecuteStrategies-384316604-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-28T17:54:29Z,user_data=None,user_id='40bca16232f3471c8094a414f8874e9a',uuid=483838f6-db84-40e5-a623-242e763d4feb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5d766eeb-6df1-461c-b40b-6552738e5458", "address": "fa:16:3e:1f:75:aa", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d766eeb-6d", "ovs_interfaceid": "5d766eeb-6df1-461c-b40b-6552738e5458", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 28 17:55:28 compute-0 nova_compute[187223]: 2025-11-28 17:55:28.795 187227 DEBUG nova.network.os_vif_util [None req-4f66245e-53e0-42d5-85ae-62a56ca9d485 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Converting VIF {"id": "5d766eeb-6df1-461c-b40b-6552738e5458", "address": "fa:16:3e:1f:75:aa", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d766eeb-6d", "ovs_interfaceid": "5d766eeb-6df1-461c-b40b-6552738e5458", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 17:55:28 compute-0 nova_compute[187223]: 2025-11-28 17:55:28.795 187227 DEBUG nova.network.os_vif_util [None req-4f66245e-53e0-42d5-85ae-62a56ca9d485 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1f:75:aa,bridge_name='br-int',has_traffic_filtering=True,id=5d766eeb-6df1-461c-b40b-6552738e5458,network=Network(7710a7d0-31b3-4473-89c4-40533fdd6e7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5d766eeb-6d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 17:55:28 compute-0 nova_compute[187223]: 2025-11-28 17:55:28.796 187227 DEBUG os_vif [None req-4f66245e-53e0-42d5-85ae-62a56ca9d485 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:1f:75:aa,bridge_name='br-int',has_traffic_filtering=True,id=5d766eeb-6df1-461c-b40b-6552738e5458,network=Network(7710a7d0-31b3-4473-89c4-40533fdd6e7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5d766eeb-6d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 28 17:55:28 compute-0 nova_compute[187223]: 2025-11-28 17:55:28.798 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:55:28 compute-0 nova_compute[187223]: 2025-11-28 17:55:28.798 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5d766eeb-6d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:55:28 compute-0 nova_compute[187223]: 2025-11-28 17:55:28.800 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:55:28 compute-0 nova_compute[187223]: 2025-11-28 17:55:28.801 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:55:28 compute-0 nova_compute[187223]: 2025-11-28 17:55:28.806 187227 INFO os_vif [None req-4f66245e-53e0-42d5-85ae-62a56ca9d485 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:1f:75:aa,bridge_name='br-int',has_traffic_filtering=True,id=5d766eeb-6df1-461c-b40b-6552738e5458,network=Network(7710a7d0-31b3-4473-89c4-40533fdd6e7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5d766eeb-6d')
Nov 28 17:55:28 compute-0 nova_compute[187223]: 2025-11-28 17:55:28.806 187227 INFO nova.virt.libvirt.driver [None req-4f66245e-53e0-42d5-85ae-62a56ca9d485 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 483838f6-db84-40e5-a623-242e763d4feb] Deleting instance files /var/lib/nova/instances/483838f6-db84-40e5-a623-242e763d4feb_del
Nov 28 17:55:28 compute-0 nova_compute[187223]: 2025-11-28 17:55:28.807 187227 INFO nova.virt.libvirt.driver [None req-4f66245e-53e0-42d5-85ae-62a56ca9d485 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 483838f6-db84-40e5-a623-242e763d4feb] Deletion of /var/lib/nova/instances/483838f6-db84-40e5-a623-242e763d4feb_del complete
Nov 28 17:55:28 compute-0 nova_compute[187223]: 2025-11-28 17:55:28.878 187227 INFO nova.compute.manager [None req-4f66245e-53e0-42d5-85ae-62a56ca9d485 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 483838f6-db84-40e5-a623-242e763d4feb] Took 0.40 seconds to destroy the instance on the hypervisor.
Nov 28 17:55:28 compute-0 nova_compute[187223]: 2025-11-28 17:55:28.878 187227 DEBUG oslo.service.loopingcall [None req-4f66245e-53e0-42d5-85ae-62a56ca9d485 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 28 17:55:28 compute-0 nova_compute[187223]: 2025-11-28 17:55:28.879 187227 DEBUG nova.compute.manager [-] [instance: 483838f6-db84-40e5-a623-242e763d4feb] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 28 17:55:28 compute-0 nova_compute[187223]: 2025-11-28 17:55:28.879 187227 DEBUG nova.network.neutron [-] [instance: 483838f6-db84-40e5-a623-242e763d4feb] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 28 17:55:29 compute-0 nova_compute[187223]: 2025-11-28 17:55:29.297 187227 DEBUG nova.compute.manager [req-9c9dcf30-2fb7-4973-b730-7225c9891e6a req-a43e030a-e5f7-4cb3-8353-936af7bb8153 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 483838f6-db84-40e5-a623-242e763d4feb] Received event network-vif-unplugged-5d766eeb-6df1-461c-b40b-6552738e5458 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:55:29 compute-0 nova_compute[187223]: 2025-11-28 17:55:29.297 187227 DEBUG oslo_concurrency.lockutils [req-9c9dcf30-2fb7-4973-b730-7225c9891e6a req-a43e030a-e5f7-4cb3-8353-936af7bb8153 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "483838f6-db84-40e5-a623-242e763d4feb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:55:29 compute-0 nova_compute[187223]: 2025-11-28 17:55:29.298 187227 DEBUG oslo_concurrency.lockutils [req-9c9dcf30-2fb7-4973-b730-7225c9891e6a req-a43e030a-e5f7-4cb3-8353-936af7bb8153 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "483838f6-db84-40e5-a623-242e763d4feb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:55:29 compute-0 nova_compute[187223]: 2025-11-28 17:55:29.298 187227 DEBUG oslo_concurrency.lockutils [req-9c9dcf30-2fb7-4973-b730-7225c9891e6a req-a43e030a-e5f7-4cb3-8353-936af7bb8153 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "483838f6-db84-40e5-a623-242e763d4feb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:55:29 compute-0 nova_compute[187223]: 2025-11-28 17:55:29.298 187227 DEBUG nova.compute.manager [req-9c9dcf30-2fb7-4973-b730-7225c9891e6a req-a43e030a-e5f7-4cb3-8353-936af7bb8153 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 483838f6-db84-40e5-a623-242e763d4feb] No waiting events found dispatching network-vif-unplugged-5d766eeb-6df1-461c-b40b-6552738e5458 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:55:29 compute-0 nova_compute[187223]: 2025-11-28 17:55:29.299 187227 DEBUG nova.compute.manager [req-9c9dcf30-2fb7-4973-b730-7225c9891e6a req-a43e030a-e5f7-4cb3-8353-936af7bb8153 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 483838f6-db84-40e5-a623-242e763d4feb] Received event network-vif-unplugged-5d766eeb-6df1-461c-b40b-6552738e5458 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 28 17:55:29 compute-0 podman[197556]: time="2025-11-28T17:55:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:55:29 compute-0 podman[197556]: @ - - [28/Nov/2025:17:55:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18323 "" "Go-http-client/1.1"
Nov 28 17:55:29 compute-0 podman[197556]: @ - - [28/Nov/2025:17:55:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3059 "" "Go-http-client/1.1"
Nov 28 17:55:30 compute-0 nova_compute[187223]: 2025-11-28 17:55:30.080 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:55:30 compute-0 nova_compute[187223]: 2025-11-28 17:55:30.177 187227 DEBUG nova.network.neutron [-] [instance: 483838f6-db84-40e5-a623-242e763d4feb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:55:30 compute-0 nova_compute[187223]: 2025-11-28 17:55:30.196 187227 INFO nova.compute.manager [-] [instance: 483838f6-db84-40e5-a623-242e763d4feb] Took 1.32 seconds to deallocate network for instance.
Nov 28 17:55:30 compute-0 podman[217605]: 2025-11-28 17:55:30.203615706 +0000 UTC m=+0.060300168 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, 
org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 28 17:55:30 compute-0 nova_compute[187223]: 2025-11-28 17:55:30.256 187227 DEBUG oslo_concurrency.lockutils [None req-4f66245e-53e0-42d5-85ae-62a56ca9d485 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:55:30 compute-0 nova_compute[187223]: 2025-11-28 17:55:30.257 187227 DEBUG oslo_concurrency.lockutils [None req-4f66245e-53e0-42d5-85ae-62a56ca9d485 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:55:30 compute-0 nova_compute[187223]: 2025-11-28 17:55:30.265 187227 DEBUG nova.compute.manager [req-65719448-470a-4196-8e64-c228511b84cd req-b632fa48-c658-47b2-8796-2faac1ff41dd 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 483838f6-db84-40e5-a623-242e763d4feb] Received event network-vif-deleted-5d766eeb-6df1-461c-b40b-6552738e5458 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:55:30 compute-0 nova_compute[187223]: 2025-11-28 17:55:30.346 187227 DEBUG nova.compute.provider_tree [None req-4f66245e-53e0-42d5-85ae-62a56ca9d485 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 17:55:30 compute-0 nova_compute[187223]: 2025-11-28 17:55:30.362 187227 DEBUG nova.scheduler.client.report [None req-4f66245e-53e0-42d5-85ae-62a56ca9d485 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 17:55:30 compute-0 nova_compute[187223]: 2025-11-28 17:55:30.383 187227 DEBUG oslo_concurrency.lockutils [None req-4f66245e-53e0-42d5-85ae-62a56ca9d485 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.127s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:55:30 compute-0 nova_compute[187223]: 2025-11-28 17:55:30.406 187227 INFO nova.scheduler.client.report [None req-4f66245e-53e0-42d5-85ae-62a56ca9d485 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Deleted allocations for instance 483838f6-db84-40e5-a623-242e763d4feb
Nov 28 17:55:30 compute-0 nova_compute[187223]: 2025-11-28 17:55:30.466 187227 DEBUG oslo_concurrency.lockutils [None req-4f66245e-53e0-42d5-85ae-62a56ca9d485 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "483838f6-db84-40e5-a623-242e763d4feb" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.988s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:55:30 compute-0 nova_compute[187223]: 2025-11-28 17:55:30.807 187227 DEBUG oslo_concurrency.lockutils [None req-bdb62e5f-779b-4209-bf8f-8849630e9c58 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Acquiring lock "47555dcf-fdd9-4b13-9d89-6c1d5bad67f2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:55:30 compute-0 nova_compute[187223]: 2025-11-28 17:55:30.807 187227 DEBUG oslo_concurrency.lockutils [None req-bdb62e5f-779b-4209-bf8f-8849630e9c58 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "47555dcf-fdd9-4b13-9d89-6c1d5bad67f2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:55:30 compute-0 nova_compute[187223]: 2025-11-28 17:55:30.807 187227 DEBUG oslo_concurrency.lockutils [None req-bdb62e5f-779b-4209-bf8f-8849630e9c58 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Acquiring lock "47555dcf-fdd9-4b13-9d89-6c1d5bad67f2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:55:30 compute-0 nova_compute[187223]: 2025-11-28 17:55:30.807 187227 DEBUG oslo_concurrency.lockutils [None req-bdb62e5f-779b-4209-bf8f-8849630e9c58 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "47555dcf-fdd9-4b13-9d89-6c1d5bad67f2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:55:30 compute-0 nova_compute[187223]: 2025-11-28 17:55:30.807 187227 DEBUG oslo_concurrency.lockutils [None req-bdb62e5f-779b-4209-bf8f-8849630e9c58 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "47555dcf-fdd9-4b13-9d89-6c1d5bad67f2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:55:30 compute-0 nova_compute[187223]: 2025-11-28 17:55:30.808 187227 INFO nova.compute.manager [None req-bdb62e5f-779b-4209-bf8f-8849630e9c58 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 47555dcf-fdd9-4b13-9d89-6c1d5bad67f2] Terminating instance
Nov 28 17:55:30 compute-0 nova_compute[187223]: 2025-11-28 17:55:30.809 187227 DEBUG nova.compute.manager [None req-bdb62e5f-779b-4209-bf8f-8849630e9c58 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 47555dcf-fdd9-4b13-9d89-6c1d5bad67f2] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 28 17:55:30 compute-0 kernel: tapc078831e-eb (unregistering): left promiscuous mode
Nov 28 17:55:30 compute-0 NetworkManager[55763]: <info>  [1764352530.8339] device (tapc078831e-eb): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 28 17:55:30 compute-0 ovn_controller[95574]: 2025-11-28T17:55:30Z|00184|binding|INFO|Releasing lport c078831e-eb1e-4052-871e-e28c19f0f56f from this chassis (sb_readonly=0)
Nov 28 17:55:30 compute-0 ovn_controller[95574]: 2025-11-28T17:55:30Z|00185|binding|INFO|Setting lport c078831e-eb1e-4052-871e-e28c19f0f56f down in Southbound
Nov 28 17:55:30 compute-0 ovn_controller[95574]: 2025-11-28T17:55:30Z|00186|binding|INFO|Removing iface tapc078831e-eb ovn-installed in OVS
Nov 28 17:55:30 compute-0 nova_compute[187223]: 2025-11-28 17:55:30.838 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:55:30 compute-0 nova_compute[187223]: 2025-11-28 17:55:30.840 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:55:30 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:55:30.863 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f6:a0:5d 10.100.0.4'], port_security=['fa:16:3e:f6:a0:5d 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '47555dcf-fdd9-4b13-9d89-6c1d5bad67f2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f987f40adf1f46018ab0ca81b8d954f6', 'neutron:revision_number': '13', 'neutron:security_group_ids': '2454196c-3f33-40c4-b65e-ff5ce51cee25', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8b91a17c-2725-4cda-baa8-a6987e8e4bba, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], logical_port=c078831e-eb1e-4052-871e-e28c19f0f56f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 17:55:30 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:55:30.865 104433 INFO neutron.agent.ovn.metadata.agent [-] Port c078831e-eb1e-4052-871e-e28c19f0f56f in datapath 7710a7d0-31b3-4473-89c4-40533fdd6e7d unbound from our chassis
Nov 28 17:55:30 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:55:30.866 104433 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7710a7d0-31b3-4473-89c4-40533fdd6e7d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 17:55:30 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:55:30.868 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[63d502f1-f14b-49aa-8f63-4e02c72284d3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:55:30 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:55:30.868 104433 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d namespace which is not needed anymore
Nov 28 17:55:30 compute-0 nova_compute[187223]: 2025-11-28 17:55:30.871 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:55:30 compute-0 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000017.scope: Deactivated successfully.
Nov 28 17:55:30 compute-0 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000017.scope: Consumed 2.377s CPU time.
Nov 28 17:55:30 compute-0 systemd-machined[153517]: Machine qemu-17-instance-00000017 terminated.
Nov 28 17:55:31 compute-0 NetworkManager[55763]: <info>  [1764352531.0338] manager: (tapc078831e-eb): new Tun device (/org/freedesktop/NetworkManager/Devices/77)
Nov 28 17:55:31 compute-0 neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d[217243]: [NOTICE]   (217247) : haproxy version is 2.8.14-c23fe91
Nov 28 17:55:31 compute-0 neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d[217243]: [NOTICE]   (217247) : path to executable is /usr/sbin/haproxy
Nov 28 17:55:31 compute-0 neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d[217243]: [WARNING]  (217247) : Exiting Master process...
Nov 28 17:55:31 compute-0 neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d[217243]: [WARNING]  (217247) : Exiting Master process...
Nov 28 17:55:31 compute-0 neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d[217243]: [ALERT]    (217247) : Current worker (217249) exited with code 143 (Terminated)
Nov 28 17:55:31 compute-0 neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d[217243]: [WARNING]  (217247) : All workers exited. Exiting... (0)
Nov 28 17:55:31 compute-0 systemd[1]: libpod-9def1c45b49768736aa251d7d5763d6cde9d84ac67db05681fc7418061b1d569.scope: Deactivated successfully.
Nov 28 17:55:31 compute-0 podman[217646]: 2025-11-28 17:55:31.059889896 +0000 UTC m=+0.065720332 container died 9def1c45b49768736aa251d7d5763d6cde9d84ac67db05681fc7418061b1d569 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 28 17:55:31 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9def1c45b49768736aa251d7d5763d6cde9d84ac67db05681fc7418061b1d569-userdata-shm.mount: Deactivated successfully.
Nov 28 17:55:31 compute-0 systemd[1]: var-lib-containers-storage-overlay-67fe6d84e4a42a07208eed7734351f007b412da73f1bd34e7771867da2a8c455-merged.mount: Deactivated successfully.
Nov 28 17:55:31 compute-0 nova_compute[187223]: 2025-11-28 17:55:31.092 187227 INFO nova.virt.libvirt.driver [-] [instance: 47555dcf-fdd9-4b13-9d89-6c1d5bad67f2] Instance destroyed successfully.
Nov 28 17:55:31 compute-0 nova_compute[187223]: 2025-11-28 17:55:31.093 187227 DEBUG nova.objects.instance [None req-bdb62e5f-779b-4209-bf8f-8849630e9c58 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lazy-loading 'resources' on Instance uuid 47555dcf-fdd9-4b13-9d89-6c1d5bad67f2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:55:31 compute-0 podman[217646]: 2025-11-28 17:55:31.096161449 +0000 UTC m=+0.101991885 container cleanup 9def1c45b49768736aa251d7d5763d6cde9d84ac67db05681fc7418061b1d569 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 28 17:55:31 compute-0 systemd[1]: libpod-conmon-9def1c45b49768736aa251d7d5763d6cde9d84ac67db05681fc7418061b1d569.scope: Deactivated successfully.
Nov 28 17:55:31 compute-0 nova_compute[187223]: 2025-11-28 17:55:31.117 187227 DEBUG nova.virt.libvirt.vif [None req-bdb62e5f-779b-4209-bf8f-8849630e9c58 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-28T17:54:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1673754372',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1673754372',id=23,image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-28T17:54:12Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f987f40adf1f46018ab0ca81b8d954f6',ramdisk_id='',reservation_id='r-cpxxvpdl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0'
,owner_project_name='tempest-TestExecuteStrategies-384316604',owner_user_name='tempest-TestExecuteStrategies-384316604-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-28T17:55:24Z,user_data=None,user_id='40bca16232f3471c8094a414f8874e9a',uuid=47555dcf-fdd9-4b13-9d89-6c1d5bad67f2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c078831e-eb1e-4052-871e-e28c19f0f56f", "address": "fa:16:3e:f6:a0:5d", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc078831e-eb", "ovs_interfaceid": "c078831e-eb1e-4052-871e-e28c19f0f56f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 28 17:55:31 compute-0 nova_compute[187223]: 2025-11-28 17:55:31.118 187227 DEBUG nova.network.os_vif_util [None req-bdb62e5f-779b-4209-bf8f-8849630e9c58 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Converting VIF {"id": "c078831e-eb1e-4052-871e-e28c19f0f56f", "address": "fa:16:3e:f6:a0:5d", "network": {"id": "7710a7d0-31b3-4473-89c4-40533fdd6e7d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1077506511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f987f40adf1f46018ab0ca81b8d954f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc078831e-eb", "ovs_interfaceid": "c078831e-eb1e-4052-871e-e28c19f0f56f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 17:55:31 compute-0 nova_compute[187223]: 2025-11-28 17:55:31.119 187227 DEBUG nova.network.os_vif_util [None req-bdb62e5f-779b-4209-bf8f-8849630e9c58 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f6:a0:5d,bridge_name='br-int',has_traffic_filtering=True,id=c078831e-eb1e-4052-871e-e28c19f0f56f,network=Network(7710a7d0-31b3-4473-89c4-40533fdd6e7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc078831e-eb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 17:55:31 compute-0 nova_compute[187223]: 2025-11-28 17:55:31.119 187227 DEBUG os_vif [None req-bdb62e5f-779b-4209-bf8f-8849630e9c58 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f6:a0:5d,bridge_name='br-int',has_traffic_filtering=True,id=c078831e-eb1e-4052-871e-e28c19f0f56f,network=Network(7710a7d0-31b3-4473-89c4-40533fdd6e7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc078831e-eb') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 28 17:55:31 compute-0 nova_compute[187223]: 2025-11-28 17:55:31.121 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:55:31 compute-0 nova_compute[187223]: 2025-11-28 17:55:31.121 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc078831e-eb, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:55:31 compute-0 nova_compute[187223]: 2025-11-28 17:55:31.122 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:55:31 compute-0 nova_compute[187223]: 2025-11-28 17:55:31.124 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:55:31 compute-0 nova_compute[187223]: 2025-11-28 17:55:31.126 187227 INFO os_vif [None req-bdb62e5f-779b-4209-bf8f-8849630e9c58 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f6:a0:5d,bridge_name='br-int',has_traffic_filtering=True,id=c078831e-eb1e-4052-871e-e28c19f0f56f,network=Network(7710a7d0-31b3-4473-89c4-40533fdd6e7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc078831e-eb')
Nov 28 17:55:31 compute-0 nova_compute[187223]: 2025-11-28 17:55:31.127 187227 INFO nova.virt.libvirt.driver [None req-bdb62e5f-779b-4209-bf8f-8849630e9c58 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 47555dcf-fdd9-4b13-9d89-6c1d5bad67f2] Deleting instance files /var/lib/nova/instances/47555dcf-fdd9-4b13-9d89-6c1d5bad67f2_del
Nov 28 17:55:31 compute-0 nova_compute[187223]: 2025-11-28 17:55:31.127 187227 INFO nova.virt.libvirt.driver [None req-bdb62e5f-779b-4209-bf8f-8849630e9c58 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 47555dcf-fdd9-4b13-9d89-6c1d5bad67f2] Deletion of /var/lib/nova/instances/47555dcf-fdd9-4b13-9d89-6c1d5bad67f2_del complete
Nov 28 17:55:31 compute-0 podman[217693]: 2025-11-28 17:55:31.161583161 +0000 UTC m=+0.043597172 container remove 9def1c45b49768736aa251d7d5763d6cde9d84ac67db05681fc7418061b1d569 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 28 17:55:31 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:55:31.166 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[0587023e-22cf-43a0-bde9-3d022426e4e7]: (4, ('Fri Nov 28 05:55:30 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d (9def1c45b49768736aa251d7d5763d6cde9d84ac67db05681fc7418061b1d569)\n9def1c45b49768736aa251d7d5763d6cde9d84ac67db05681fc7418061b1d569\nFri Nov 28 05:55:31 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d (9def1c45b49768736aa251d7d5763d6cde9d84ac67db05681fc7418061b1d569)\n9def1c45b49768736aa251d7d5763d6cde9d84ac67db05681fc7418061b1d569\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:55:31 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:55:31.168 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[b2ffa8be-5a6a-4129-bb3b-38844ecf8113]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:55:31 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:55:31.169 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7710a7d0-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:55:31 compute-0 nova_compute[187223]: 2025-11-28 17:55:31.171 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:55:31 compute-0 kernel: tap7710a7d0-30: left promiscuous mode
Nov 28 17:55:31 compute-0 nova_compute[187223]: 2025-11-28 17:55:31.175 187227 INFO nova.compute.manager [None req-bdb62e5f-779b-4209-bf8f-8849630e9c58 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] [instance: 47555dcf-fdd9-4b13-9d89-6c1d5bad67f2] Took 0.37 seconds to destroy the instance on the hypervisor.
Nov 28 17:55:31 compute-0 nova_compute[187223]: 2025-11-28 17:55:31.176 187227 DEBUG oslo.service.loopingcall [None req-bdb62e5f-779b-4209-bf8f-8849630e9c58 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 28 17:55:31 compute-0 nova_compute[187223]: 2025-11-28 17:55:31.176 187227 DEBUG nova.compute.manager [-] [instance: 47555dcf-fdd9-4b13-9d89-6c1d5bad67f2] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 28 17:55:31 compute-0 nova_compute[187223]: 2025-11-28 17:55:31.177 187227 DEBUG nova.network.neutron [-] [instance: 47555dcf-fdd9-4b13-9d89-6c1d5bad67f2] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 28 17:55:31 compute-0 nova_compute[187223]: 2025-11-28 17:55:31.184 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:55:31 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:55:31.189 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[221852ac-71e7-46c8-91fb-dc8c65aa2aa0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:55:31 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:55:31.201 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[57aed8a1-a851-4cc0-8405-6725f558cedd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:55:31 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:55:31.203 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[2131eb68-7f49-420a-831b-b774f2ab0755]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:55:31 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:55:31.218 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[b093d16f-e501-4d3e-850a-8326b82230ec]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 564331, 'reachable_time': 23971, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217708, 'error': None, 'target': 'ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:55:31 compute-0 systemd[1]: run-netns-ovnmeta\x2d7710a7d0\x2d31b3\x2d4473\x2d89c4\x2d40533fdd6e7d.mount: Deactivated successfully.
Nov 28 17:55:31 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:55:31.222 104546 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7710a7d0-31b3-4473-89c4-40533fdd6e7d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 28 17:55:31 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:55:31.223 104546 DEBUG oslo.privsep.daemon [-] privsep: reply[f2e1e247-14d5-40b1-bf93-fdfe2eb22896]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:55:31 compute-0 openstack_network_exporter[199717]: ERROR   17:55:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:55:31 compute-0 openstack_network_exporter[199717]: ERROR   17:55:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:55:31 compute-0 openstack_network_exporter[199717]: ERROR   17:55:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:55:31 compute-0 openstack_network_exporter[199717]: ERROR   17:55:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:55:31 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:55:31 compute-0 openstack_network_exporter[199717]: ERROR   17:55:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:55:31 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:55:31 compute-0 nova_compute[187223]: 2025-11-28 17:55:31.417 187227 DEBUG nova.compute.manager [req-df6cbd76-e6ae-4953-a797-4c2bb4fba44f req-38a8f989-761e-4c70-80e3-9fb6e452591c 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 483838f6-db84-40e5-a623-242e763d4feb] Received event network-vif-plugged-5d766eeb-6df1-461c-b40b-6552738e5458 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:55:31 compute-0 nova_compute[187223]: 2025-11-28 17:55:31.417 187227 DEBUG oslo_concurrency.lockutils [req-df6cbd76-e6ae-4953-a797-4c2bb4fba44f req-38a8f989-761e-4c70-80e3-9fb6e452591c 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "483838f6-db84-40e5-a623-242e763d4feb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:55:31 compute-0 nova_compute[187223]: 2025-11-28 17:55:31.418 187227 DEBUG oslo_concurrency.lockutils [req-df6cbd76-e6ae-4953-a797-4c2bb4fba44f req-38a8f989-761e-4c70-80e3-9fb6e452591c 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "483838f6-db84-40e5-a623-242e763d4feb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:55:31 compute-0 nova_compute[187223]: 2025-11-28 17:55:31.419 187227 DEBUG oslo_concurrency.lockutils [req-df6cbd76-e6ae-4953-a797-4c2bb4fba44f req-38a8f989-761e-4c70-80e3-9fb6e452591c 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "483838f6-db84-40e5-a623-242e763d4feb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:55:31 compute-0 nova_compute[187223]: 2025-11-28 17:55:31.420 187227 DEBUG nova.compute.manager [req-df6cbd76-e6ae-4953-a797-4c2bb4fba44f req-38a8f989-761e-4c70-80e3-9fb6e452591c 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 483838f6-db84-40e5-a623-242e763d4feb] No waiting events found dispatching network-vif-plugged-5d766eeb-6df1-461c-b40b-6552738e5458 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:55:31 compute-0 nova_compute[187223]: 2025-11-28 17:55:31.420 187227 WARNING nova.compute.manager [req-df6cbd76-e6ae-4953-a797-4c2bb4fba44f req-38a8f989-761e-4c70-80e3-9fb6e452591c 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 483838f6-db84-40e5-a623-242e763d4feb] Received unexpected event network-vif-plugged-5d766eeb-6df1-461c-b40b-6552738e5458 for instance with vm_state deleted and task_state None.
Nov 28 17:55:31 compute-0 nova_compute[187223]: 2025-11-28 17:55:31.421 187227 DEBUG nova.compute.manager [req-df6cbd76-e6ae-4953-a797-4c2bb4fba44f req-38a8f989-761e-4c70-80e3-9fb6e452591c 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 47555dcf-fdd9-4b13-9d89-6c1d5bad67f2] Received event network-vif-unplugged-c078831e-eb1e-4052-871e-e28c19f0f56f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:55:31 compute-0 nova_compute[187223]: 2025-11-28 17:55:31.422 187227 DEBUG oslo_concurrency.lockutils [req-df6cbd76-e6ae-4953-a797-4c2bb4fba44f req-38a8f989-761e-4c70-80e3-9fb6e452591c 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "47555dcf-fdd9-4b13-9d89-6c1d5bad67f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:55:31 compute-0 nova_compute[187223]: 2025-11-28 17:55:31.422 187227 DEBUG oslo_concurrency.lockutils [req-df6cbd76-e6ae-4953-a797-4c2bb4fba44f req-38a8f989-761e-4c70-80e3-9fb6e452591c 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "47555dcf-fdd9-4b13-9d89-6c1d5bad67f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:55:31 compute-0 nova_compute[187223]: 2025-11-28 17:55:31.423 187227 DEBUG oslo_concurrency.lockutils [req-df6cbd76-e6ae-4953-a797-4c2bb4fba44f req-38a8f989-761e-4c70-80e3-9fb6e452591c 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "47555dcf-fdd9-4b13-9d89-6c1d5bad67f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:55:31 compute-0 nova_compute[187223]: 2025-11-28 17:55:31.424 187227 DEBUG nova.compute.manager [req-df6cbd76-e6ae-4953-a797-4c2bb4fba44f req-38a8f989-761e-4c70-80e3-9fb6e452591c 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 47555dcf-fdd9-4b13-9d89-6c1d5bad67f2] No waiting events found dispatching network-vif-unplugged-c078831e-eb1e-4052-871e-e28c19f0f56f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:55:31 compute-0 nova_compute[187223]: 2025-11-28 17:55:31.424 187227 DEBUG nova.compute.manager [req-df6cbd76-e6ae-4953-a797-4c2bb4fba44f req-38a8f989-761e-4c70-80e3-9fb6e452591c 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 47555dcf-fdd9-4b13-9d89-6c1d5bad67f2] Received event network-vif-unplugged-c078831e-eb1e-4052-871e-e28c19f0f56f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 28 17:55:31 compute-0 nova_compute[187223]: 2025-11-28 17:55:31.641 187227 DEBUG nova.network.neutron [-] [instance: 47555dcf-fdd9-4b13-9d89-6c1d5bad67f2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:55:31 compute-0 nova_compute[187223]: 2025-11-28 17:55:31.662 187227 INFO nova.compute.manager [-] [instance: 47555dcf-fdd9-4b13-9d89-6c1d5bad67f2] Took 0.49 seconds to deallocate network for instance.
Nov 28 17:55:31 compute-0 nova_compute[187223]: 2025-11-28 17:55:31.712 187227 DEBUG oslo_concurrency.lockutils [None req-bdb62e5f-779b-4209-bf8f-8849630e9c58 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:55:31 compute-0 nova_compute[187223]: 2025-11-28 17:55:31.713 187227 DEBUG oslo_concurrency.lockutils [None req-bdb62e5f-779b-4209-bf8f-8849630e9c58 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:55:31 compute-0 nova_compute[187223]: 2025-11-28 17:55:31.719 187227 DEBUG oslo_concurrency.lockutils [None req-bdb62e5f-779b-4209-bf8f-8849630e9c58 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.006s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:55:31 compute-0 nova_compute[187223]: 2025-11-28 17:55:31.744 187227 INFO nova.scheduler.client.report [None req-bdb62e5f-779b-4209-bf8f-8849630e9c58 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Deleted allocations for instance 47555dcf-fdd9-4b13-9d89-6c1d5bad67f2
Nov 28 17:55:31 compute-0 nova_compute[187223]: 2025-11-28 17:55:31.812 187227 DEBUG oslo_concurrency.lockutils [None req-bdb62e5f-779b-4209-bf8f-8849630e9c58 40bca16232f3471c8094a414f8874e9a f987f40adf1f46018ab0ca81b8d954f6 - - default default] Lock "47555dcf-fdd9-4b13-9d89-6c1d5bad67f2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:55:32 compute-0 nova_compute[187223]: 2025-11-28 17:55:32.369 187227 DEBUG nova.compute.manager [req-9a921d18-e1b7-4fb1-aabb-8c79a5bad7bf req-f7e744d0-5bc2-48fb-ada3-a90541d489a3 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 47555dcf-fdd9-4b13-9d89-6c1d5bad67f2] Received event network-vif-deleted-c078831e-eb1e-4052-871e-e28c19f0f56f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:55:33 compute-0 nova_compute[187223]: 2025-11-28 17:55:33.513 187227 DEBUG nova.compute.manager [req-365db18f-e901-4974-b522-9d8665de1c54 req-d35b2e69-58ae-4283-b9d8-1269e231407e 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 47555dcf-fdd9-4b13-9d89-6c1d5bad67f2] Received event network-vif-plugged-c078831e-eb1e-4052-871e-e28c19f0f56f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:55:33 compute-0 nova_compute[187223]: 2025-11-28 17:55:33.513 187227 DEBUG oslo_concurrency.lockutils [req-365db18f-e901-4974-b522-9d8665de1c54 req-d35b2e69-58ae-4283-b9d8-1269e231407e 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "47555dcf-fdd9-4b13-9d89-6c1d5bad67f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:55:33 compute-0 nova_compute[187223]: 2025-11-28 17:55:33.513 187227 DEBUG oslo_concurrency.lockutils [req-365db18f-e901-4974-b522-9d8665de1c54 req-d35b2e69-58ae-4283-b9d8-1269e231407e 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "47555dcf-fdd9-4b13-9d89-6c1d5bad67f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:55:33 compute-0 nova_compute[187223]: 2025-11-28 17:55:33.514 187227 DEBUG oslo_concurrency.lockutils [req-365db18f-e901-4974-b522-9d8665de1c54 req-d35b2e69-58ae-4283-b9d8-1269e231407e 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "47555dcf-fdd9-4b13-9d89-6c1d5bad67f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:55:33 compute-0 nova_compute[187223]: 2025-11-28 17:55:33.514 187227 DEBUG nova.compute.manager [req-365db18f-e901-4974-b522-9d8665de1c54 req-d35b2e69-58ae-4283-b9d8-1269e231407e 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 47555dcf-fdd9-4b13-9d89-6c1d5bad67f2] No waiting events found dispatching network-vif-plugged-c078831e-eb1e-4052-871e-e28c19f0f56f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:55:33 compute-0 nova_compute[187223]: 2025-11-28 17:55:33.514 187227 WARNING nova.compute.manager [req-365db18f-e901-4974-b522-9d8665de1c54 req-d35b2e69-58ae-4283-b9d8-1269e231407e 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 47555dcf-fdd9-4b13-9d89-6c1d5bad67f2] Received unexpected event network-vif-plugged-c078831e-eb1e-4052-871e-e28c19f0f56f for instance with vm_state deleted and task_state None.
Nov 28 17:55:35 compute-0 nova_compute[187223]: 2025-11-28 17:55:35.127 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:55:36 compute-0 nova_compute[187223]: 2025-11-28 17:55:36.124 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:55:38 compute-0 podman[217710]: 2025-11-28 17:55:38.210201677 +0000 UTC m=+0.070846538 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 17:55:38 compute-0 podman[217711]: 2025-11-28 17:55:38.241989112 +0000 UTC m=+0.095371247 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 28 17:55:40 compute-0 nova_compute[187223]: 2025-11-28 17:55:40.130 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:55:41 compute-0 nova_compute[187223]: 2025-11-28 17:55:41.127 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:55:41 compute-0 podman[217756]: 2025-11-28 17:55:41.209050708 +0000 UTC m=+0.075015348 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.33.7, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, distribution-scope=public)
Nov 28 17:55:43 compute-0 nova_compute[187223]: 2025-11-28 17:55:43.768 187227 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764352528.7673993, 483838f6-db84-40e5-a623-242e763d4feb => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:55:43 compute-0 nova_compute[187223]: 2025-11-28 17:55:43.769 187227 INFO nova.compute.manager [-] [instance: 483838f6-db84-40e5-a623-242e763d4feb] VM Stopped (Lifecycle Event)
Nov 28 17:55:43 compute-0 nova_compute[187223]: 2025-11-28 17:55:43.792 187227 DEBUG nova.compute.manager [None req-942d6008-d65e-4885-9639-44d2c9438a1e - - - - - -] [instance: 483838f6-db84-40e5-a623-242e763d4feb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:55:45 compute-0 nova_compute[187223]: 2025-11-28 17:55:45.136 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:55:45 compute-0 nova_compute[187223]: 2025-11-28 17:55:45.685 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:55:45 compute-0 nova_compute[187223]: 2025-11-28 17:55:45.686 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:55:45 compute-0 nova_compute[187223]: 2025-11-28 17:55:45.686 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 17:55:46 compute-0 nova_compute[187223]: 2025-11-28 17:55:46.088 187227 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764352531.0849438, 47555dcf-fdd9-4b13-9d89-6c1d5bad67f2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:55:46 compute-0 nova_compute[187223]: 2025-11-28 17:55:46.088 187227 INFO nova.compute.manager [-] [instance: 47555dcf-fdd9-4b13-9d89-6c1d5bad67f2] VM Stopped (Lifecycle Event)
Nov 28 17:55:46 compute-0 nova_compute[187223]: 2025-11-28 17:55:46.115 187227 DEBUG nova.compute.manager [None req-c619edcc-2fed-4f84-bcdb-2ac1cab6f101 - - - - - -] [instance: 47555dcf-fdd9-4b13-9d89-6c1d5bad67f2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:55:46 compute-0 nova_compute[187223]: 2025-11-28 17:55:46.129 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:55:46 compute-0 nova_compute[187223]: 2025-11-28 17:55:46.685 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:55:48 compute-0 nova_compute[187223]: 2025-11-28 17:55:48.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:55:50 compute-0 nova_compute[187223]: 2025-11-28 17:55:50.139 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:55:51 compute-0 nova_compute[187223]: 2025-11-28 17:55:51.132 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:55:51 compute-0 nova_compute[187223]: 2025-11-28 17:55:51.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:55:51 compute-0 nova_compute[187223]: 2025-11-28 17:55:51.685 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 17:55:51 compute-0 nova_compute[187223]: 2025-11-28 17:55:51.685 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 17:55:51 compute-0 nova_compute[187223]: 2025-11-28 17:55:51.701 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 17:55:51 compute-0 nova_compute[187223]: 2025-11-28 17:55:51.701 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:55:51 compute-0 nova_compute[187223]: 2025-11-28 17:55:51.733 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:55:51 compute-0 nova_compute[187223]: 2025-11-28 17:55:51.734 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:55:51 compute-0 nova_compute[187223]: 2025-11-28 17:55:51.734 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:55:51 compute-0 nova_compute[187223]: 2025-11-28 17:55:51.735 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 17:55:51 compute-0 nova_compute[187223]: 2025-11-28 17:55:51.925 187227 WARNING nova.virt.libvirt.driver [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 17:55:51 compute-0 nova_compute[187223]: 2025-11-28 17:55:51.926 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5873MB free_disk=73.34142303466797GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 17:55:51 compute-0 nova_compute[187223]: 2025-11-28 17:55:51.926 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:55:51 compute-0 nova_compute[187223]: 2025-11-28 17:55:51.926 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:55:51 compute-0 nova_compute[187223]: 2025-11-28 17:55:51.976 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 17:55:51 compute-0 nova_compute[187223]: 2025-11-28 17:55:51.976 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 17:55:52 compute-0 nova_compute[187223]: 2025-11-28 17:55:52.019 187227 DEBUG nova.compute.provider_tree [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 17:55:52 compute-0 nova_compute[187223]: 2025-11-28 17:55:52.037 187227 DEBUG nova.scheduler.client.report [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 17:55:52 compute-0 nova_compute[187223]: 2025-11-28 17:55:52.052 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 17:55:52 compute-0 nova_compute[187223]: 2025-11-28 17:55:52.052 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.126s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:55:54 compute-0 podman[217779]: 2025-11-28 17:55:54.186880218 +0000 UTC m=+0.049693587 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 17:55:55 compute-0 nova_compute[187223]: 2025-11-28 17:55:55.034 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:55:55 compute-0 nova_compute[187223]: 2025-11-28 17:55:55.035 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:55:55 compute-0 nova_compute[187223]: 2025-11-28 17:55:55.146 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:55:55 compute-0 nova_compute[187223]: 2025-11-28 17:55:55.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:55:56 compute-0 nova_compute[187223]: 2025-11-28 17:55:56.134 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:55:59 compute-0 podman[197556]: time="2025-11-28T17:55:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:55:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:55:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 28 17:55:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:55:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2599 "" "Go-http-client/1.1"
Nov 28 17:56:00 compute-0 nova_compute[187223]: 2025-11-28 17:56:00.149 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:56:01 compute-0 nova_compute[187223]: 2025-11-28 17:56:01.135 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:56:01 compute-0 podman[217805]: 2025-11-28 17:56:01.228114172 +0000 UTC m=+0.076922341 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Nov 28 17:56:01 compute-0 openstack_network_exporter[199717]: ERROR   17:56:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:56:01 compute-0 openstack_network_exporter[199717]: ERROR   17:56:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:56:01 compute-0 openstack_network_exporter[199717]: ERROR   17:56:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:56:01 compute-0 openstack_network_exporter[199717]: ERROR   17:56:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:56:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:56:01 compute-0 openstack_network_exporter[199717]: ERROR   17:56:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:56:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:56:01 compute-0 ovn_controller[95574]: 2025-11-28T17:56:01Z|00187|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Nov 28 17:56:05 compute-0 nova_compute[187223]: 2025-11-28 17:56:05.151 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:56:06 compute-0 nova_compute[187223]: 2025-11-28 17:56:06.137 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:56:09 compute-0 podman[217824]: 2025-11-28 17:56:09.194840762 +0000 UTC m=+0.056184394 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 17:56:09 compute-0 podman[217825]: 2025-11-28 17:56:09.221683367 +0000 UTC m=+0.083077270 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3)
Nov 28 17:56:10 compute-0 nova_compute[187223]: 2025-11-28 17:56:10.153 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:56:10 compute-0 nova_compute[187223]: 2025-11-28 17:56:10.652 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:56:11 compute-0 nova_compute[187223]: 2025-11-28 17:56:11.140 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:56:12 compute-0 podman[217871]: 2025-11-28 17:56:12.220726663 +0000 UTC m=+0.082018000 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, vcs-type=git, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_id=edpm, io.buildah.version=1.33.7, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 28 17:56:15 compute-0 nova_compute[187223]: 2025-11-28 17:56:15.155 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:56:16 compute-0 nova_compute[187223]: 2025-11-28 17:56:16.193 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:56:18 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:56:18.266 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=26, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3e:3c:af', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '0e:e0:60:f9:5f:25'}, ipsec=False) old=SB_Global(nb_cfg=25) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 17:56:18 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:56:18.267 104433 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 17:56:18 compute-0 nova_compute[187223]: 2025-11-28 17:56:18.267 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:56:20 compute-0 nova_compute[187223]: 2025-11-28 17:56:20.158 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:56:20 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:56:20.270 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2ad2dbac-a967-40fb-b69b-7c374c5f8e9d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '26'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:56:21 compute-0 nova_compute[187223]: 2025-11-28 17:56:21.196 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:56:21 compute-0 sshd-session[217892]: Invalid user sol from 193.32.162.146 port 53090
Nov 28 17:56:22 compute-0 sshd-session[217892]: Connection closed by invalid user sol 193.32.162.146 port 53090 [preauth]
Nov 28 17:56:25 compute-0 nova_compute[187223]: 2025-11-28 17:56:25.161 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:56:25 compute-0 podman[217894]: 2025-11-28 17:56:25.217517933 +0000 UTC m=+0.072669460 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 17:56:26 compute-0 nova_compute[187223]: 2025-11-28 17:56:26.199 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:56:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:56:27.707 104433 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:56:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:56:27.708 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:56:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:56:27.708 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:56:29 compute-0 podman[197556]: time="2025-11-28T17:56:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:56:29 compute-0 podman[197556]: @ - - [28/Nov/2025:17:56:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 28 17:56:29 compute-0 podman[197556]: @ - - [28/Nov/2025:17:56:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2603 "" "Go-http-client/1.1"
Nov 28 17:56:30 compute-0 nova_compute[187223]: 2025-11-28 17:56:30.163 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:56:31 compute-0 nova_compute[187223]: 2025-11-28 17:56:31.202 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:56:31 compute-0 openstack_network_exporter[199717]: ERROR   17:56:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:56:31 compute-0 openstack_network_exporter[199717]: ERROR   17:56:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:56:31 compute-0 openstack_network_exporter[199717]: ERROR   17:56:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:56:31 compute-0 openstack_network_exporter[199717]: ERROR   17:56:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:56:31 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:56:31 compute-0 openstack_network_exporter[199717]: ERROR   17:56:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:56:31 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:56:32 compute-0 podman[217918]: 2025-11-28 17:56:32.183863229 +0000 UTC m=+0.050025216 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent)
Nov 28 17:56:35 compute-0 nova_compute[187223]: 2025-11-28 17:56:35.165 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:56:36 compute-0 nova_compute[187223]: 2025-11-28 17:56:36.204 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:56:40 compute-0 nova_compute[187223]: 2025-11-28 17:56:40.168 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:56:40 compute-0 podman[217937]: 2025-11-28 17:56:40.276386128 +0000 UTC m=+0.128336857 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, org.label-schema.build-date=20251125)
Nov 28 17:56:40 compute-0 podman[217938]: 2025-11-28 17:56:40.297732755 +0000 UTC m=+0.145216025 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 28 17:56:41 compute-0 nova_compute[187223]: 2025-11-28 17:56:41.207 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:56:43 compute-0 podman[217982]: 2025-11-28 17:56:43.216525753 +0000 UTC m=+0.078570130 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.openshift.expose-services=, name=ubi9-minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc.)
Nov 28 17:56:45 compute-0 nova_compute[187223]: 2025-11-28 17:56:45.213 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:56:45 compute-0 nova_compute[187223]: 2025-11-28 17:56:45.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:56:45 compute-0 nova_compute[187223]: 2025-11-28 17:56:45.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:56:45 compute-0 nova_compute[187223]: 2025-11-28 17:56:45.683 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 17:56:45 compute-0 ovn_controller[95574]: 2025-11-28T17:56:45Z|00188|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Nov 28 17:56:46 compute-0 nova_compute[187223]: 2025-11-28 17:56:46.209 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:56:46 compute-0 nova_compute[187223]: 2025-11-28 17:56:46.678 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:56:47 compute-0 nova_compute[187223]: 2025-11-28 17:56:47.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:56:48 compute-0 nova_compute[187223]: 2025-11-28 17:56:48.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:56:50 compute-0 nova_compute[187223]: 2025-11-28 17:56:50.217 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:56:51 compute-0 nova_compute[187223]: 2025-11-28 17:56:51.213 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:56:53 compute-0 nova_compute[187223]: 2025-11-28 17:56:53.686 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:56:53 compute-0 nova_compute[187223]: 2025-11-28 17:56:53.686 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 17:56:53 compute-0 nova_compute[187223]: 2025-11-28 17:56:53.686 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 17:56:53 compute-0 nova_compute[187223]: 2025-11-28 17:56:53.706 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 17:56:53 compute-0 nova_compute[187223]: 2025-11-28 17:56:53.707 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:56:53 compute-0 nova_compute[187223]: 2025-11-28 17:56:53.730 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:56:53 compute-0 nova_compute[187223]: 2025-11-28 17:56:53.731 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:56:53 compute-0 nova_compute[187223]: 2025-11-28 17:56:53.731 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:56:53 compute-0 nova_compute[187223]: 2025-11-28 17:56:53.731 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 17:56:53 compute-0 nova_compute[187223]: 2025-11-28 17:56:53.893 187227 WARNING nova.virt.libvirt.driver [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 17:56:53 compute-0 nova_compute[187223]: 2025-11-28 17:56:53.894 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5875MB free_disk=73.34147262573242GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 17:56:53 compute-0 nova_compute[187223]: 2025-11-28 17:56:53.895 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:56:53 compute-0 nova_compute[187223]: 2025-11-28 17:56:53.895 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:56:53 compute-0 nova_compute[187223]: 2025-11-28 17:56:53.949 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 17:56:53 compute-0 nova_compute[187223]: 2025-11-28 17:56:53.950 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 17:56:53 compute-0 nova_compute[187223]: 2025-11-28 17:56:53.965 187227 DEBUG nova.scheduler.client.report [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Refreshing inventories for resource provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 28 17:56:53 compute-0 nova_compute[187223]: 2025-11-28 17:56:53.984 187227 DEBUG nova.scheduler.client.report [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Updating ProviderTree inventory for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 28 17:56:53 compute-0 nova_compute[187223]: 2025-11-28 17:56:53.985 187227 DEBUG nova.compute.provider_tree [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Updating inventory in ProviderTree for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 28 17:56:54 compute-0 nova_compute[187223]: 2025-11-28 17:56:54.004 187227 DEBUG nova.scheduler.client.report [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Refreshing aggregate associations for resource provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 28 17:56:54 compute-0 nova_compute[187223]: 2025-11-28 17:56:54.027 187227 DEBUG nova.scheduler.client.report [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Refreshing trait associations for resource provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5, traits: COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_USB,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE41,HW_CPU_X86_SSE42,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_SSE,HW_CPU_X86_SSSE3,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 28 17:56:54 compute-0 nova_compute[187223]: 2025-11-28 17:56:54.056 187227 DEBUG nova.compute.provider_tree [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 17:56:54 compute-0 nova_compute[187223]: 2025-11-28 17:56:54.070 187227 DEBUG nova.scheduler.client.report [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 17:56:54 compute-0 nova_compute[187223]: 2025-11-28 17:56:54.072 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 17:56:54 compute-0 nova_compute[187223]: 2025-11-28 17:56:54.072 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.177s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:56:55 compute-0 nova_compute[187223]: 2025-11-28 17:56:55.048 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:56:55 compute-0 nova_compute[187223]: 2025-11-28 17:56:55.219 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:56:55 compute-0 sshd-session[218004]: Invalid user sol from 193.32.162.145 port 36988
Nov 28 17:56:55 compute-0 nova_compute[187223]: 2025-11-28 17:56:55.679 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:56:55 compute-0 nova_compute[187223]: 2025-11-28 17:56:55.682 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:56:55 compute-0 podman[218006]: 2025-11-28 17:56:55.754582136 +0000 UTC m=+0.080566998 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 17:56:55 compute-0 sshd-session[218004]: Connection closed by invalid user sol 193.32.162.145 port 36988 [preauth]
Nov 28 17:56:56 compute-0 nova_compute[187223]: 2025-11-28 17:56:56.216 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:56:59 compute-0 podman[197556]: time="2025-11-28T17:56:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:56:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:56:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 28 17:56:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:56:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2601 "" "Go-http-client/1.1"
Nov 28 17:57:00 compute-0 nova_compute[187223]: 2025-11-28 17:57:00.224 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:57:01 compute-0 nova_compute[187223]: 2025-11-28 17:57:01.219 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:57:01 compute-0 openstack_network_exporter[199717]: ERROR   17:57:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:57:01 compute-0 openstack_network_exporter[199717]: ERROR   17:57:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:57:01 compute-0 openstack_network_exporter[199717]: ERROR   17:57:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:57:01 compute-0 openstack_network_exporter[199717]: ERROR   17:57:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:57:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:57:01 compute-0 openstack_network_exporter[199717]: ERROR   17:57:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:57:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:57:03 compute-0 podman[218030]: 2025-11-28 17:57:03.220989633 +0000 UTC m=+0.081248677 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 28 17:57:05 compute-0 nova_compute[187223]: 2025-11-28 17:57:05.229 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:57:06 compute-0 nova_compute[187223]: 2025-11-28 17:57:06.222 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:57:09 compute-0 nova_compute[187223]: 2025-11-28 17:57:09.505 187227 DEBUG oslo_concurrency.lockutils [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Acquiring lock "3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:57:09 compute-0 nova_compute[187223]: 2025-11-28 17:57:09.506 187227 DEBUG oslo_concurrency.lockutils [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Lock "3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:57:09 compute-0 nova_compute[187223]: 2025-11-28 17:57:09.528 187227 DEBUG nova.compute.manager [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 28 17:57:09 compute-0 nova_compute[187223]: 2025-11-28 17:57:09.615 187227 DEBUG oslo_concurrency.lockutils [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:57:09 compute-0 nova_compute[187223]: 2025-11-28 17:57:09.616 187227 DEBUG oslo_concurrency.lockutils [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:57:09 compute-0 nova_compute[187223]: 2025-11-28 17:57:09.622 187227 DEBUG nova.virt.hardware [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 28 17:57:09 compute-0 nova_compute[187223]: 2025-11-28 17:57:09.622 187227 INFO nova.compute.claims [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] Claim successful on node compute-0.ctlplane.example.com
Nov 28 17:57:09 compute-0 nova_compute[187223]: 2025-11-28 17:57:09.705 187227 DEBUG nova.compute.provider_tree [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 17:57:09 compute-0 nova_compute[187223]: 2025-11-28 17:57:09.718 187227 DEBUG nova.scheduler.client.report [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 17:57:09 compute-0 nova_compute[187223]: 2025-11-28 17:57:09.737 187227 DEBUG oslo_concurrency.lockutils [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:57:09 compute-0 nova_compute[187223]: 2025-11-28 17:57:09.738 187227 DEBUG nova.compute.manager [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 28 17:57:09 compute-0 nova_compute[187223]: 2025-11-28 17:57:09.792 187227 DEBUG nova.compute.manager [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 28 17:57:09 compute-0 nova_compute[187223]: 2025-11-28 17:57:09.793 187227 DEBUG nova.network.neutron [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 28 17:57:09 compute-0 nova_compute[187223]: 2025-11-28 17:57:09.832 187227 INFO nova.virt.libvirt.driver [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 28 17:57:09 compute-0 nova_compute[187223]: 2025-11-28 17:57:09.847 187227 DEBUG nova.compute.manager [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 28 17:57:09 compute-0 nova_compute[187223]: 2025-11-28 17:57:09.941 187227 DEBUG nova.compute.manager [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 28 17:57:09 compute-0 nova_compute[187223]: 2025-11-28 17:57:09.943 187227 DEBUG nova.virt.libvirt.driver [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 28 17:57:09 compute-0 nova_compute[187223]: 2025-11-28 17:57:09.943 187227 INFO nova.virt.libvirt.driver [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] Creating image(s)
Nov 28 17:57:09 compute-0 nova_compute[187223]: 2025-11-28 17:57:09.944 187227 DEBUG oslo_concurrency.lockutils [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Acquiring lock "/var/lib/nova/instances/3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:57:09 compute-0 nova_compute[187223]: 2025-11-28 17:57:09.944 187227 DEBUG oslo_concurrency.lockutils [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Lock "/var/lib/nova/instances/3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:57:09 compute-0 nova_compute[187223]: 2025-11-28 17:57:09.945 187227 DEBUG oslo_concurrency.lockutils [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Lock "/var/lib/nova/instances/3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:57:09 compute-0 nova_compute[187223]: 2025-11-28 17:57:09.956 187227 DEBUG oslo_concurrency.processutils [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:57:10 compute-0 nova_compute[187223]: 2025-11-28 17:57:10.011 187227 DEBUG oslo_concurrency.processutils [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:57:10 compute-0 nova_compute[187223]: 2025-11-28 17:57:10.012 187227 DEBUG oslo_concurrency.lockutils [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Acquiring lock "bd6565e0cc32420aac21dd3de8842bc53bb13eba" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:57:10 compute-0 nova_compute[187223]: 2025-11-28 17:57:10.013 187227 DEBUG oslo_concurrency.lockutils [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Lock "bd6565e0cc32420aac21dd3de8842bc53bb13eba" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:57:10 compute-0 nova_compute[187223]: 2025-11-28 17:57:10.023 187227 DEBUG oslo_concurrency.processutils [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:57:10 compute-0 nova_compute[187223]: 2025-11-28 17:57:10.074 187227 DEBUG oslo_concurrency.processutils [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:57:10 compute-0 nova_compute[187223]: 2025-11-28 17:57:10.075 187227 DEBUG oslo_concurrency.processutils [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba,backing_fmt=raw /var/lib/nova/instances/3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:57:10 compute-0 nova_compute[187223]: 2025-11-28 17:57:10.105 187227 DEBUG oslo_concurrency.processutils [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba,backing_fmt=raw /var/lib/nova/instances/3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c/disk 1073741824" returned: 0 in 0.030s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:57:10 compute-0 nova_compute[187223]: 2025-11-28 17:57:10.106 187227 DEBUG oslo_concurrency.lockutils [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Lock "bd6565e0cc32420aac21dd3de8842bc53bb13eba" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.094s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:57:10 compute-0 nova_compute[187223]: 2025-11-28 17:57:10.107 187227 DEBUG oslo_concurrency.processutils [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:57:10 compute-0 nova_compute[187223]: 2025-11-28 17:57:10.157 187227 DEBUG oslo_concurrency.processutils [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:57:10 compute-0 nova_compute[187223]: 2025-11-28 17:57:10.158 187227 DEBUG nova.virt.disk.api [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Checking if we can resize image /var/lib/nova/instances/3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 28 17:57:10 compute-0 nova_compute[187223]: 2025-11-28 17:57:10.159 187227 DEBUG oslo_concurrency.processutils [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:57:10 compute-0 nova_compute[187223]: 2025-11-28 17:57:10.210 187227 DEBUG oslo_concurrency.processutils [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:57:10 compute-0 nova_compute[187223]: 2025-11-28 17:57:10.211 187227 DEBUG nova.virt.disk.api [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Cannot resize image /var/lib/nova/instances/3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 28 17:57:10 compute-0 nova_compute[187223]: 2025-11-28 17:57:10.212 187227 DEBUG nova.objects.instance [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Lazy-loading 'migration_context' on Instance uuid 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:57:10 compute-0 nova_compute[187223]: 2025-11-28 17:57:10.216 187227 DEBUG nova.policy [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0be3599a4e1a4dc2bece41c2543e4bc9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd29c5331f43848f995592402975b88d1', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 28 17:57:10 compute-0 nova_compute[187223]: 2025-11-28 17:57:10.231 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:57:10 compute-0 nova_compute[187223]: 2025-11-28 17:57:10.234 187227 DEBUG nova.virt.libvirt.driver [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 28 17:57:10 compute-0 nova_compute[187223]: 2025-11-28 17:57:10.234 187227 DEBUG nova.virt.libvirt.driver [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] Ensure instance console log exists: /var/lib/nova/instances/3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 28 17:57:10 compute-0 nova_compute[187223]: 2025-11-28 17:57:10.234 187227 DEBUG oslo_concurrency.lockutils [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:57:10 compute-0 nova_compute[187223]: 2025-11-28 17:57:10.234 187227 DEBUG oslo_concurrency.lockutils [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:57:10 compute-0 nova_compute[187223]: 2025-11-28 17:57:10.235 187227 DEBUG oslo_concurrency.lockutils [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:57:10 compute-0 nova_compute[187223]: 2025-11-28 17:57:10.793 187227 DEBUG nova.network.neutron [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] Successfully created port: 3868291d-fb0c-4bb9-94d3-897a190444b7 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 28 17:57:11 compute-0 podman[218065]: 2025-11-28 17:57:11.203079143 +0000 UTC m=+0.063003720 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 28 17:57:11 compute-0 nova_compute[187223]: 2025-11-28 17:57:11.224 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:57:11 compute-0 podman[218066]: 2025-11-28 17:57:11.268466892 +0000 UTC m=+0.119310947 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 28 17:57:11 compute-0 nova_compute[187223]: 2025-11-28 17:57:11.419 187227 DEBUG nova.network.neutron [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] Successfully updated port: 3868291d-fb0c-4bb9-94d3-897a190444b7 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 28 17:57:11 compute-0 nova_compute[187223]: 2025-11-28 17:57:11.433 187227 DEBUG oslo_concurrency.lockutils [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Acquiring lock "refresh_cache-3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 17:57:11 compute-0 nova_compute[187223]: 2025-11-28 17:57:11.434 187227 DEBUG oslo_concurrency.lockutils [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Acquired lock "refresh_cache-3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 17:57:11 compute-0 nova_compute[187223]: 2025-11-28 17:57:11.434 187227 DEBUG nova.network.neutron [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 28 17:57:11 compute-0 nova_compute[187223]: 2025-11-28 17:57:11.504 187227 DEBUG nova.compute.manager [req-e29b91b6-9e27-4510-ae4e-cffc00b925ff req-5113f111-b44b-4c90-84f3-3fa4a532a478 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] Received event network-changed-3868291d-fb0c-4bb9-94d3-897a190444b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:57:11 compute-0 nova_compute[187223]: 2025-11-28 17:57:11.505 187227 DEBUG nova.compute.manager [req-e29b91b6-9e27-4510-ae4e-cffc00b925ff req-5113f111-b44b-4c90-84f3-3fa4a532a478 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] Refreshing instance network info cache due to event network-changed-3868291d-fb0c-4bb9-94d3-897a190444b7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 28 17:57:11 compute-0 nova_compute[187223]: 2025-11-28 17:57:11.505 187227 DEBUG oslo_concurrency.lockutils [req-e29b91b6-9e27-4510-ae4e-cffc00b925ff req-5113f111-b44b-4c90-84f3-3fa4a532a478 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "refresh_cache-3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 17:57:11 compute-0 nova_compute[187223]: 2025-11-28 17:57:11.566 187227 DEBUG nova.network.neutron [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 28 17:57:13 compute-0 nova_compute[187223]: 2025-11-28 17:57:13.224 187227 DEBUG nova.network.neutron [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] Updating instance_info_cache with network_info: [{"id": "3868291d-fb0c-4bb9-94d3-897a190444b7", "address": "fa:16:3e:7b:aa:57", "network": {"id": "2ff1e6ce-b114-4644-bacd-9ad17a33b1a4", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-706434666-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d29c5331f43848f995592402975b88d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3868291d-fb", "ovs_interfaceid": "3868291d-fb0c-4bb9-94d3-897a190444b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:57:13 compute-0 nova_compute[187223]: 2025-11-28 17:57:13.244 187227 DEBUG oslo_concurrency.lockutils [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Releasing lock "refresh_cache-3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 17:57:13 compute-0 nova_compute[187223]: 2025-11-28 17:57:13.244 187227 DEBUG nova.compute.manager [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] Instance network_info: |[{"id": "3868291d-fb0c-4bb9-94d3-897a190444b7", "address": "fa:16:3e:7b:aa:57", "network": {"id": "2ff1e6ce-b114-4644-bacd-9ad17a33b1a4", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-706434666-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d29c5331f43848f995592402975b88d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3868291d-fb", "ovs_interfaceid": "3868291d-fb0c-4bb9-94d3-897a190444b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 28 17:57:13 compute-0 nova_compute[187223]: 2025-11-28 17:57:13.245 187227 DEBUG oslo_concurrency.lockutils [req-e29b91b6-9e27-4510-ae4e-cffc00b925ff req-5113f111-b44b-4c90-84f3-3fa4a532a478 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquired lock "refresh_cache-3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 17:57:13 compute-0 nova_compute[187223]: 2025-11-28 17:57:13.245 187227 DEBUG nova.network.neutron [req-e29b91b6-9e27-4510-ae4e-cffc00b925ff req-5113f111-b44b-4c90-84f3-3fa4a532a478 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] Refreshing network info cache for port 3868291d-fb0c-4bb9-94d3-897a190444b7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 28 17:57:13 compute-0 nova_compute[187223]: 2025-11-28 17:57:13.248 187227 DEBUG nova.virt.libvirt.driver [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] Start _get_guest_xml network_info=[{"id": "3868291d-fb0c-4bb9-94d3-897a190444b7", "address": "fa:16:3e:7b:aa:57", "network": {"id": "2ff1e6ce-b114-4644-bacd-9ad17a33b1a4", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-706434666-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d29c5331f43848f995592402975b88d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3868291d-fb", "ovs_interfaceid": "3868291d-fb0c-4bb9-94d3-897a190444b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T17:27:07Z,direct_url=<?>,disk_format='qcow2',id=e66bcfff-a835-4b6a-9892-490d158c356a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ca3b936304c947f98c618f4bf25dee0e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-28T17:27:08Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_format': None, 'device_name': '/dev/vda', 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e66bcfff-a835-4b6a-9892-490d158c356a'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 28 17:57:13 compute-0 nova_compute[187223]: 2025-11-28 17:57:13.252 187227 WARNING nova.virt.libvirt.driver [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 17:57:13 compute-0 nova_compute[187223]: 2025-11-28 17:57:13.257 187227 DEBUG nova.virt.libvirt.host [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 28 17:57:13 compute-0 nova_compute[187223]: 2025-11-28 17:57:13.257 187227 DEBUG nova.virt.libvirt.host [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 28 17:57:13 compute-0 nova_compute[187223]: 2025-11-28 17:57:13.261 187227 DEBUG nova.virt.libvirt.host [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 28 17:57:13 compute-0 nova_compute[187223]: 2025-11-28 17:57:13.261 187227 DEBUG nova.virt.libvirt.host [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 28 17:57:13 compute-0 nova_compute[187223]: 2025-11-28 17:57:13.263 187227 DEBUG nova.virt.libvirt.driver [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 28 17:57:13 compute-0 nova_compute[187223]: 2025-11-28 17:57:13.263 187227 DEBUG nova.virt.hardware [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-28T17:27:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6f44bded-bdbe-4623-9c87-afc5919e8381',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T17:27:07Z,direct_url=<?>,disk_format='qcow2',id=e66bcfff-a835-4b6a-9892-490d158c356a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ca3b936304c947f98c618f4bf25dee0e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-28T17:27:08Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 28 17:57:13 compute-0 nova_compute[187223]: 2025-11-28 17:57:13.263 187227 DEBUG nova.virt.hardware [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 28 17:57:13 compute-0 nova_compute[187223]: 2025-11-28 17:57:13.263 187227 DEBUG nova.virt.hardware [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 28 17:57:13 compute-0 nova_compute[187223]: 2025-11-28 17:57:13.264 187227 DEBUG nova.virt.hardware [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 28 17:57:13 compute-0 nova_compute[187223]: 2025-11-28 17:57:13.264 187227 DEBUG nova.virt.hardware [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 28 17:57:13 compute-0 nova_compute[187223]: 2025-11-28 17:57:13.264 187227 DEBUG nova.virt.hardware [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 28 17:57:13 compute-0 nova_compute[187223]: 2025-11-28 17:57:13.264 187227 DEBUG nova.virt.hardware [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 28 17:57:13 compute-0 nova_compute[187223]: 2025-11-28 17:57:13.264 187227 DEBUG nova.virt.hardware [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 28 17:57:13 compute-0 nova_compute[187223]: 2025-11-28 17:57:13.265 187227 DEBUG nova.virt.hardware [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 28 17:57:13 compute-0 nova_compute[187223]: 2025-11-28 17:57:13.265 187227 DEBUG nova.virt.hardware [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 28 17:57:13 compute-0 nova_compute[187223]: 2025-11-28 17:57:13.265 187227 DEBUG nova.virt.hardware [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 28 17:57:13 compute-0 nova_compute[187223]: 2025-11-28 17:57:13.269 187227 DEBUG nova.virt.libvirt.vif [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T17:57:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-1751850922',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-1751850922',id=26,image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d29c5331f43848f995592402975b88d1',ramdisk_id='',reservation_id='r-dr437yks',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-480471442',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-480471442-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T17:57:09Z,user_data=None,user_id='0be3599a4e1a4dc2bece41c2543e4bc9',uuid=3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3868291d-fb0c-4bb9-94d3-897a190444b7", "address": "fa:16:3e:7b:aa:57", "network": {"id": "2ff1e6ce-b114-4644-bacd-9ad17a33b1a4", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-706434666-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d29c5331f43848f995592402975b88d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3868291d-fb", "ovs_interfaceid": "3868291d-fb0c-4bb9-94d3-897a190444b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 28 17:57:13 compute-0 nova_compute[187223]: 2025-11-28 17:57:13.269 187227 DEBUG nova.network.os_vif_util [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Converting VIF {"id": "3868291d-fb0c-4bb9-94d3-897a190444b7", "address": "fa:16:3e:7b:aa:57", "network": {"id": "2ff1e6ce-b114-4644-bacd-9ad17a33b1a4", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-706434666-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d29c5331f43848f995592402975b88d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3868291d-fb", "ovs_interfaceid": "3868291d-fb0c-4bb9-94d3-897a190444b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 17:57:13 compute-0 nova_compute[187223]: 2025-11-28 17:57:13.270 187227 DEBUG nova.network.os_vif_util [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7b:aa:57,bridge_name='br-int',has_traffic_filtering=True,id=3868291d-fb0c-4bb9-94d3-897a190444b7,network=Network(2ff1e6ce-b114-4644-bacd-9ad17a33b1a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3868291d-fb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 17:57:13 compute-0 nova_compute[187223]: 2025-11-28 17:57:13.271 187227 DEBUG nova.objects.instance [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Lazy-loading 'pci_devices' on Instance uuid 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:57:13 compute-0 nova_compute[187223]: 2025-11-28 17:57:13.285 187227 DEBUG nova.virt.libvirt.driver [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] End _get_guest_xml xml=<domain type="kvm">
Nov 28 17:57:13 compute-0 nova_compute[187223]:   <uuid>3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c</uuid>
Nov 28 17:57:13 compute-0 nova_compute[187223]:   <name>instance-0000001a</name>
Nov 28 17:57:13 compute-0 nova_compute[187223]:   <memory>131072</memory>
Nov 28 17:57:13 compute-0 nova_compute[187223]:   <vcpu>1</vcpu>
Nov 28 17:57:13 compute-0 nova_compute[187223]:   <metadata>
Nov 28 17:57:13 compute-0 nova_compute[187223]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 28 17:57:13 compute-0 nova_compute[187223]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 28 17:57:13 compute-0 nova_compute[187223]:       <nova:name>tempest-TestExecuteVmWorkloadBalanceStrategy-server-1751850922</nova:name>
Nov 28 17:57:13 compute-0 nova_compute[187223]:       <nova:creationTime>2025-11-28 17:57:13</nova:creationTime>
Nov 28 17:57:13 compute-0 nova_compute[187223]:       <nova:flavor name="m1.nano">
Nov 28 17:57:13 compute-0 nova_compute[187223]:         <nova:memory>128</nova:memory>
Nov 28 17:57:13 compute-0 nova_compute[187223]:         <nova:disk>1</nova:disk>
Nov 28 17:57:13 compute-0 nova_compute[187223]:         <nova:swap>0</nova:swap>
Nov 28 17:57:13 compute-0 nova_compute[187223]:         <nova:ephemeral>0</nova:ephemeral>
Nov 28 17:57:13 compute-0 nova_compute[187223]:         <nova:vcpus>1</nova:vcpus>
Nov 28 17:57:13 compute-0 nova_compute[187223]:       </nova:flavor>
Nov 28 17:57:13 compute-0 nova_compute[187223]:       <nova:owner>
Nov 28 17:57:13 compute-0 nova_compute[187223]:         <nova:user uuid="0be3599a4e1a4dc2bece41c2543e4bc9">tempest-TestExecuteVmWorkloadBalanceStrategy-480471442-project-member</nova:user>
Nov 28 17:57:13 compute-0 nova_compute[187223]:         <nova:project uuid="d29c5331f43848f995592402975b88d1">tempest-TestExecuteVmWorkloadBalanceStrategy-480471442</nova:project>
Nov 28 17:57:13 compute-0 nova_compute[187223]:       </nova:owner>
Nov 28 17:57:13 compute-0 nova_compute[187223]:       <nova:root type="image" uuid="e66bcfff-a835-4b6a-9892-490d158c356a"/>
Nov 28 17:57:13 compute-0 nova_compute[187223]:       <nova:ports>
Nov 28 17:57:13 compute-0 nova_compute[187223]:         <nova:port uuid="3868291d-fb0c-4bb9-94d3-897a190444b7">
Nov 28 17:57:13 compute-0 nova_compute[187223]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Nov 28 17:57:13 compute-0 nova_compute[187223]:         </nova:port>
Nov 28 17:57:13 compute-0 nova_compute[187223]:       </nova:ports>
Nov 28 17:57:13 compute-0 nova_compute[187223]:     </nova:instance>
Nov 28 17:57:13 compute-0 nova_compute[187223]:   </metadata>
Nov 28 17:57:13 compute-0 nova_compute[187223]:   <sysinfo type="smbios">
Nov 28 17:57:13 compute-0 nova_compute[187223]:     <system>
Nov 28 17:57:13 compute-0 nova_compute[187223]:       <entry name="manufacturer">RDO</entry>
Nov 28 17:57:13 compute-0 nova_compute[187223]:       <entry name="product">OpenStack Compute</entry>
Nov 28 17:57:13 compute-0 nova_compute[187223]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 28 17:57:13 compute-0 nova_compute[187223]:       <entry name="serial">3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c</entry>
Nov 28 17:57:13 compute-0 nova_compute[187223]:       <entry name="uuid">3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c</entry>
Nov 28 17:57:13 compute-0 nova_compute[187223]:       <entry name="family">Virtual Machine</entry>
Nov 28 17:57:13 compute-0 nova_compute[187223]:     </system>
Nov 28 17:57:13 compute-0 nova_compute[187223]:   </sysinfo>
Nov 28 17:57:13 compute-0 nova_compute[187223]:   <os>
Nov 28 17:57:13 compute-0 nova_compute[187223]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 28 17:57:13 compute-0 nova_compute[187223]:     <boot dev="hd"/>
Nov 28 17:57:13 compute-0 nova_compute[187223]:     <smbios mode="sysinfo"/>
Nov 28 17:57:13 compute-0 nova_compute[187223]:   </os>
Nov 28 17:57:13 compute-0 nova_compute[187223]:   <features>
Nov 28 17:57:13 compute-0 nova_compute[187223]:     <acpi/>
Nov 28 17:57:13 compute-0 nova_compute[187223]:     <apic/>
Nov 28 17:57:13 compute-0 nova_compute[187223]:     <vmcoreinfo/>
Nov 28 17:57:13 compute-0 nova_compute[187223]:   </features>
Nov 28 17:57:13 compute-0 nova_compute[187223]:   <clock offset="utc">
Nov 28 17:57:13 compute-0 nova_compute[187223]:     <timer name="pit" tickpolicy="delay"/>
Nov 28 17:57:13 compute-0 nova_compute[187223]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 28 17:57:13 compute-0 nova_compute[187223]:     <timer name="hpet" present="no"/>
Nov 28 17:57:13 compute-0 nova_compute[187223]:   </clock>
Nov 28 17:57:13 compute-0 nova_compute[187223]:   <cpu mode="custom" match="exact">
Nov 28 17:57:13 compute-0 nova_compute[187223]:     <model>Nehalem</model>
Nov 28 17:57:13 compute-0 nova_compute[187223]:     <topology sockets="1" cores="1" threads="1"/>
Nov 28 17:57:13 compute-0 nova_compute[187223]:   </cpu>
Nov 28 17:57:13 compute-0 nova_compute[187223]:   <devices>
Nov 28 17:57:13 compute-0 nova_compute[187223]:     <disk type="file" device="disk">
Nov 28 17:57:13 compute-0 nova_compute[187223]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 28 17:57:13 compute-0 nova_compute[187223]:       <source file="/var/lib/nova/instances/3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c/disk"/>
Nov 28 17:57:13 compute-0 nova_compute[187223]:       <target dev="vda" bus="virtio"/>
Nov 28 17:57:13 compute-0 nova_compute[187223]:     </disk>
Nov 28 17:57:13 compute-0 nova_compute[187223]:     <disk type="file" device="cdrom">
Nov 28 17:57:13 compute-0 nova_compute[187223]:       <driver name="qemu" type="raw" cache="none"/>
Nov 28 17:57:13 compute-0 nova_compute[187223]:       <source file="/var/lib/nova/instances/3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c/disk.config"/>
Nov 28 17:57:13 compute-0 nova_compute[187223]:       <target dev="sda" bus="sata"/>
Nov 28 17:57:13 compute-0 nova_compute[187223]:     </disk>
Nov 28 17:57:13 compute-0 nova_compute[187223]:     <interface type="ethernet">
Nov 28 17:57:13 compute-0 nova_compute[187223]:       <mac address="fa:16:3e:7b:aa:57"/>
Nov 28 17:57:13 compute-0 nova_compute[187223]:       <model type="virtio"/>
Nov 28 17:57:13 compute-0 nova_compute[187223]:       <driver name="vhost" rx_queue_size="512"/>
Nov 28 17:57:13 compute-0 nova_compute[187223]:       <mtu size="1442"/>
Nov 28 17:57:13 compute-0 nova_compute[187223]:       <target dev="tap3868291d-fb"/>
Nov 28 17:57:13 compute-0 nova_compute[187223]:     </interface>
Nov 28 17:57:13 compute-0 nova_compute[187223]:     <serial type="pty">
Nov 28 17:57:13 compute-0 nova_compute[187223]:       <log file="/var/lib/nova/instances/3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c/console.log" append="off"/>
Nov 28 17:57:13 compute-0 nova_compute[187223]:     </serial>
Nov 28 17:57:13 compute-0 nova_compute[187223]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 28 17:57:13 compute-0 nova_compute[187223]:     <video>
Nov 28 17:57:13 compute-0 nova_compute[187223]:       <model type="virtio"/>
Nov 28 17:57:13 compute-0 nova_compute[187223]:     </video>
Nov 28 17:57:13 compute-0 nova_compute[187223]:     <input type="tablet" bus="usb"/>
Nov 28 17:57:13 compute-0 nova_compute[187223]:     <rng model="virtio">
Nov 28 17:57:13 compute-0 nova_compute[187223]:       <backend model="random">/dev/urandom</backend>
Nov 28 17:57:13 compute-0 nova_compute[187223]:     </rng>
Nov 28 17:57:13 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root"/>
Nov 28 17:57:13 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:57:13 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:57:13 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:57:13 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:57:13 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:57:13 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:57:13 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:57:13 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:57:13 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:57:13 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:57:13 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:57:13 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:57:13 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:57:13 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:57:13 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:57:13 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:57:13 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:57:13 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:57:13 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:57:13 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:57:13 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:57:13 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:57:13 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:57:13 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:57:13 compute-0 nova_compute[187223]:     <controller type="usb" index="0"/>
Nov 28 17:57:13 compute-0 nova_compute[187223]:     <memballoon model="virtio">
Nov 28 17:57:13 compute-0 nova_compute[187223]:       <stats period="10"/>
Nov 28 17:57:13 compute-0 nova_compute[187223]:     </memballoon>
Nov 28 17:57:13 compute-0 nova_compute[187223]:   </devices>
Nov 28 17:57:13 compute-0 nova_compute[187223]: </domain>
Nov 28 17:57:13 compute-0 nova_compute[187223]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 28 17:57:13 compute-0 nova_compute[187223]: 2025-11-28 17:57:13.285 187227 DEBUG nova.compute.manager [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] Preparing to wait for external event network-vif-plugged-3868291d-fb0c-4bb9-94d3-897a190444b7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 28 17:57:13 compute-0 nova_compute[187223]: 2025-11-28 17:57:13.285 187227 DEBUG oslo_concurrency.lockutils [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Acquiring lock "3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:57:13 compute-0 nova_compute[187223]: 2025-11-28 17:57:13.286 187227 DEBUG oslo_concurrency.lockutils [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Lock "3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:57:13 compute-0 nova_compute[187223]: 2025-11-28 17:57:13.286 187227 DEBUG oslo_concurrency.lockutils [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Lock "3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:57:13 compute-0 nova_compute[187223]: 2025-11-28 17:57:13.286 187227 DEBUG nova.virt.libvirt.vif [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T17:57:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-1751850922',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-1751850922',id=26,image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d29c5331f43848f995592402975b88d1',ramdisk_id='',reservation_id='r-dr437yks',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-480471442',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-480471442-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T17:57:09Z,user_data=None,user_id='0be3599a4e1a4dc2bece41c2543e4bc9',uuid=3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3868291d-fb0c-4bb9-94d3-897a190444b7", "address": "fa:16:3e:7b:aa:57", "network": {"id": "2ff1e6ce-b114-4644-bacd-9ad17a33b1a4", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-706434666-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d29c5331f43848f995592402975b88d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3868291d-fb", "ovs_interfaceid": "3868291d-fb0c-4bb9-94d3-897a190444b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 28 17:57:13 compute-0 nova_compute[187223]: 2025-11-28 17:57:13.286 187227 DEBUG nova.network.os_vif_util [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Converting VIF {"id": "3868291d-fb0c-4bb9-94d3-897a190444b7", "address": "fa:16:3e:7b:aa:57", "network": {"id": "2ff1e6ce-b114-4644-bacd-9ad17a33b1a4", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-706434666-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d29c5331f43848f995592402975b88d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3868291d-fb", "ovs_interfaceid": "3868291d-fb0c-4bb9-94d3-897a190444b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 17:57:13 compute-0 nova_compute[187223]: 2025-11-28 17:57:13.287 187227 DEBUG nova.network.os_vif_util [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7b:aa:57,bridge_name='br-int',has_traffic_filtering=True,id=3868291d-fb0c-4bb9-94d3-897a190444b7,network=Network(2ff1e6ce-b114-4644-bacd-9ad17a33b1a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3868291d-fb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 17:57:13 compute-0 nova_compute[187223]: 2025-11-28 17:57:13.287 187227 DEBUG os_vif [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7b:aa:57,bridge_name='br-int',has_traffic_filtering=True,id=3868291d-fb0c-4bb9-94d3-897a190444b7,network=Network(2ff1e6ce-b114-4644-bacd-9ad17a33b1a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3868291d-fb') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 28 17:57:13 compute-0 nova_compute[187223]: 2025-11-28 17:57:13.288 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:57:13 compute-0 nova_compute[187223]: 2025-11-28 17:57:13.288 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:57:13 compute-0 nova_compute[187223]: 2025-11-28 17:57:13.288 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 17:57:13 compute-0 nova_compute[187223]: 2025-11-28 17:57:13.291 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:57:13 compute-0 nova_compute[187223]: 2025-11-28 17:57:13.291 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3868291d-fb, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:57:13 compute-0 nova_compute[187223]: 2025-11-28 17:57:13.292 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3868291d-fb, col_values=(('external_ids', {'iface-id': '3868291d-fb0c-4bb9-94d3-897a190444b7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7b:aa:57', 'vm-uuid': '3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:57:13 compute-0 nova_compute[187223]: 2025-11-28 17:57:13.293 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:57:13 compute-0 NetworkManager[55763]: <info>  [1764352633.2946] manager: (tap3868291d-fb): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/78)
Nov 28 17:57:13 compute-0 nova_compute[187223]: 2025-11-28 17:57:13.295 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 17:57:13 compute-0 nova_compute[187223]: 2025-11-28 17:57:13.300 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:57:13 compute-0 nova_compute[187223]: 2025-11-28 17:57:13.301 187227 INFO os_vif [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7b:aa:57,bridge_name='br-int',has_traffic_filtering=True,id=3868291d-fb0c-4bb9-94d3-897a190444b7,network=Network(2ff1e6ce-b114-4644-bacd-9ad17a33b1a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3868291d-fb')
Nov 28 17:57:13 compute-0 nova_compute[187223]: 2025-11-28 17:57:13.349 187227 DEBUG nova.virt.libvirt.driver [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 28 17:57:13 compute-0 nova_compute[187223]: 2025-11-28 17:57:13.349 187227 DEBUG nova.virt.libvirt.driver [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 28 17:57:13 compute-0 nova_compute[187223]: 2025-11-28 17:57:13.349 187227 DEBUG nova.virt.libvirt.driver [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] No VIF found with MAC fa:16:3e:7b:aa:57, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 28 17:57:13 compute-0 nova_compute[187223]: 2025-11-28 17:57:13.350 187227 INFO nova.virt.libvirt.driver [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] Using config drive
Nov 28 17:57:14 compute-0 podman[218114]: 2025-11-28 17:57:14.201419449 +0000 UTC m=+0.060553520 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, distribution-scope=public, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, version=9.6, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible)
Nov 28 17:57:15 compute-0 nova_compute[187223]: 2025-11-28 17:57:15.234 187227 INFO nova.virt.libvirt.driver [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] Creating config drive at /var/lib/nova/instances/3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c/disk.config
Nov 28 17:57:15 compute-0 nova_compute[187223]: 2025-11-28 17:57:15.242 187227 DEBUG oslo_concurrency.processutils [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp19fj2jmt execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:57:15 compute-0 nova_compute[187223]: 2025-11-28 17:57:15.270 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:57:15 compute-0 nova_compute[187223]: 2025-11-28 17:57:15.378 187227 DEBUG oslo_concurrency.processutils [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp19fj2jmt" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:57:15 compute-0 kernel: tap3868291d-fb: entered promiscuous mode
Nov 28 17:57:15 compute-0 ovn_controller[95574]: 2025-11-28T17:57:15Z|00189|binding|INFO|Claiming lport 3868291d-fb0c-4bb9-94d3-897a190444b7 for this chassis.
Nov 28 17:57:15 compute-0 ovn_controller[95574]: 2025-11-28T17:57:15Z|00190|binding|INFO|3868291d-fb0c-4bb9-94d3-897a190444b7: Claiming fa:16:3e:7b:aa:57 10.100.0.13
Nov 28 17:57:15 compute-0 nova_compute[187223]: 2025-11-28 17:57:15.463 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:57:15 compute-0 NetworkManager[55763]: <info>  [1764352635.4652] manager: (tap3868291d-fb): new Tun device (/org/freedesktop/NetworkManager/Devices/79)
Nov 28 17:57:15 compute-0 nova_compute[187223]: 2025-11-28 17:57:15.466 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:57:15 compute-0 systemd-udevd[218153]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 17:57:15 compute-0 systemd-machined[153517]: New machine qemu-18-instance-0000001a.
Nov 28 17:57:15 compute-0 NetworkManager[55763]: <info>  [1764352635.5125] device (tap3868291d-fb): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 28 17:57:15 compute-0 NetworkManager[55763]: <info>  [1764352635.5138] device (tap3868291d-fb): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 28 17:57:15 compute-0 nova_compute[187223]: 2025-11-28 17:57:15.519 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:57:15 compute-0 systemd[1]: Started Virtual Machine qemu-18-instance-0000001a.
Nov 28 17:57:15 compute-0 nova_compute[187223]: 2025-11-28 17:57:15.525 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:57:15 compute-0 ovn_controller[95574]: 2025-11-28T17:57:15Z|00191|binding|INFO|Setting lport 3868291d-fb0c-4bb9-94d3-897a190444b7 ovn-installed in OVS
Nov 28 17:57:15 compute-0 nova_compute[187223]: 2025-11-28 17:57:15.531 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:57:15 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:57:15.565 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7b:aa:57 10.100.0.13'], port_security=['fa:16:3e:7b:aa:57 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2ff1e6ce-b114-4644-bacd-9ad17a33b1a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd29c5331f43848f995592402975b88d1', 'neutron:revision_number': '2', 'neutron:security_group_ids': '4a4ddb51-0ed6-4517-9c9f-37fff06e30e0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c9230971-b926-4b1d-9b28-cbd943422cad, chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], logical_port=3868291d-fb0c-4bb9-94d3-897a190444b7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 17:57:15 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:57:15.566 104433 INFO neutron.agent.ovn.metadata.agent [-] Port 3868291d-fb0c-4bb9-94d3-897a190444b7 in datapath 2ff1e6ce-b114-4644-bacd-9ad17a33b1a4 bound to our chassis
Nov 28 17:57:15 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:57:15.568 104433 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2ff1e6ce-b114-4644-bacd-9ad17a33b1a4
Nov 28 17:57:15 compute-0 ovn_controller[95574]: 2025-11-28T17:57:15Z|00192|binding|INFO|Setting lport 3868291d-fb0c-4bb9-94d3-897a190444b7 up in Southbound
Nov 28 17:57:15 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:57:15.582 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[6b312c47-04c3-45a3-b6f3-f4e1250e4118]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:57:15 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:57:15.583 104433 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap2ff1e6ce-b1 in ovnmeta-2ff1e6ce-b114-4644-bacd-9ad17a33b1a4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 28 17:57:15 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:57:15.585 208826 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap2ff1e6ce-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 28 17:57:15 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:57:15.585 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[f48132f5-8b2f-4f24-99cd-e021cf81f4fe]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:57:15 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:57:15.586 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[5c3968f9-9b3a-491f-b355-ad21c27ca08d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:57:15 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:57:15.608 104546 DEBUG oslo.privsep.daemon [-] privsep: reply[689e6161-7471-49f1-99c9-837eb9be274a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:57:15 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:57:15.633 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[3e66318c-82e3-4969-98d3-cfc79a1c8c71]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:57:15 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:57:15.660 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[8b299753-1268-45ad-ad9d-f905cae73520]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:57:15 compute-0 NetworkManager[55763]: <info>  [1764352635.6671] manager: (tap2ff1e6ce-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/80)
Nov 28 17:57:15 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:57:15.667 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[7477e254-26e7-4af7-944c-327e19672249]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:57:15 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:57:15.695 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[1c990568-eb40-4362-9ec8-ef67b36c5158]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:57:15 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:57:15.698 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[0f7a43f9-5ec1-4c8a-8f12-776bf197e17b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:57:15 compute-0 NetworkManager[55763]: <info>  [1764352635.7209] device (tap2ff1e6ce-b0): carrier: link connected
Nov 28 17:57:15 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:57:15.725 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[95af15b9-ae80-44a7-89dd-a84ccb50da3d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:57:15 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:57:15.741 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[4649b1f5-73af-4a2f-bdb0-f212b864c4b6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2ff1e6ce-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:66:92:f5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 56], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 581035, 'reachable_time': 15569, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218187, 'error': None, 'target': 'ovnmeta-2ff1e6ce-b114-4644-bacd-9ad17a33b1a4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:57:15 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:57:15.756 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[36340ca9-d505-4eb0-b403-1aac7ee597a3]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe66:92f5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 581035, 'tstamp': 581035}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218188, 'error': None, 'target': 'ovnmeta-2ff1e6ce-b114-4644-bacd-9ad17a33b1a4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:57:15 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:57:15.774 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[cbb92a3d-54ca-43d1-98c2-49c2e7d98971]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2ff1e6ce-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:66:92:f5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 56], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 581035, 'reachable_time': 15569, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218189, 'error': None, 'target': 'ovnmeta-2ff1e6ce-b114-4644-bacd-9ad17a33b1a4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:57:15 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:57:15.807 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[f83dbdef-575d-4c4e-aab0-d59d1bb370ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:57:15 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:57:15.862 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[104f056e-1e5f-4c95-bb88-cf4fbb5041f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:57:15 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:57:15.863 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2ff1e6ce-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:57:15 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:57:15.864 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 17:57:15 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:57:15.864 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2ff1e6ce-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:57:15 compute-0 nova_compute[187223]: 2025-11-28 17:57:15.913 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:57:15 compute-0 NetworkManager[55763]: <info>  [1764352635.9135] manager: (tap2ff1e6ce-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/81)
Nov 28 17:57:15 compute-0 kernel: tap2ff1e6ce-b0: entered promiscuous mode
Nov 28 17:57:15 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:57:15.916 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2ff1e6ce-b0, col_values=(('external_ids', {'iface-id': 'c44344cc-bddd-480f-8312-6b31c21f97df'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:57:15 compute-0 nova_compute[187223]: 2025-11-28 17:57:15.917 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:57:15 compute-0 nova_compute[187223]: 2025-11-28 17:57:15.918 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:57:15 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:57:15.919 104433 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2ff1e6ce-b114-4644-bacd-9ad17a33b1a4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2ff1e6ce-b114-4644-bacd-9ad17a33b1a4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 28 17:57:15 compute-0 ovn_controller[95574]: 2025-11-28T17:57:15Z|00193|binding|INFO|Releasing lport c44344cc-bddd-480f-8312-6b31c21f97df from this chassis (sb_readonly=0)
Nov 28 17:57:15 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:57:15.919 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[3565f4cd-af4a-4fd2-b93f-2dbcbafede84]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:57:15 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:57:15.920 104433 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 28 17:57:15 compute-0 ovn_metadata_agent[104428]: global
Nov 28 17:57:15 compute-0 ovn_metadata_agent[104428]:     log         /dev/log local0 debug
Nov 28 17:57:15 compute-0 ovn_metadata_agent[104428]:     log-tag     haproxy-metadata-proxy-2ff1e6ce-b114-4644-bacd-9ad17a33b1a4
Nov 28 17:57:15 compute-0 ovn_metadata_agent[104428]:     user        root
Nov 28 17:57:15 compute-0 ovn_metadata_agent[104428]:     group       root
Nov 28 17:57:15 compute-0 ovn_metadata_agent[104428]:     maxconn     1024
Nov 28 17:57:15 compute-0 ovn_metadata_agent[104428]:     pidfile     /var/lib/neutron/external/pids/2ff1e6ce-b114-4644-bacd-9ad17a33b1a4.pid.haproxy
Nov 28 17:57:15 compute-0 ovn_metadata_agent[104428]:     daemon
Nov 28 17:57:15 compute-0 ovn_metadata_agent[104428]: 
Nov 28 17:57:15 compute-0 ovn_metadata_agent[104428]: defaults
Nov 28 17:57:15 compute-0 ovn_metadata_agent[104428]:     log global
Nov 28 17:57:15 compute-0 ovn_metadata_agent[104428]:     mode http
Nov 28 17:57:15 compute-0 ovn_metadata_agent[104428]:     option httplog
Nov 28 17:57:15 compute-0 ovn_metadata_agent[104428]:     option dontlognull
Nov 28 17:57:15 compute-0 ovn_metadata_agent[104428]:     option http-server-close
Nov 28 17:57:15 compute-0 ovn_metadata_agent[104428]:     option forwardfor
Nov 28 17:57:15 compute-0 ovn_metadata_agent[104428]:     retries                 3
Nov 28 17:57:15 compute-0 ovn_metadata_agent[104428]:     timeout http-request    30s
Nov 28 17:57:15 compute-0 ovn_metadata_agent[104428]:     timeout connect         30s
Nov 28 17:57:15 compute-0 ovn_metadata_agent[104428]:     timeout client          32s
Nov 28 17:57:15 compute-0 ovn_metadata_agent[104428]:     timeout server          32s
Nov 28 17:57:15 compute-0 ovn_metadata_agent[104428]:     timeout http-keep-alive 30s
Nov 28 17:57:15 compute-0 ovn_metadata_agent[104428]: 
Nov 28 17:57:15 compute-0 ovn_metadata_agent[104428]: 
Nov 28 17:57:15 compute-0 ovn_metadata_agent[104428]: listen listener
Nov 28 17:57:15 compute-0 ovn_metadata_agent[104428]:     bind 169.254.169.254:80
Nov 28 17:57:15 compute-0 ovn_metadata_agent[104428]:     server metadata /var/lib/neutron/metadata_proxy
Nov 28 17:57:15 compute-0 ovn_metadata_agent[104428]:     http-request add-header X-OVN-Network-ID 2ff1e6ce-b114-4644-bacd-9ad17a33b1a4
Nov 28 17:57:15 compute-0 ovn_metadata_agent[104428]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 28 17:57:15 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:57:15.921 104433 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-2ff1e6ce-b114-4644-bacd-9ad17a33b1a4', 'env', 'PROCESS_TAG=haproxy-2ff1e6ce-b114-4644-bacd-9ad17a33b1a4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/2ff1e6ce-b114-4644-bacd-9ad17a33b1a4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 28 17:57:15 compute-0 nova_compute[187223]: 2025-11-28 17:57:15.929 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:57:16 compute-0 podman[218221]: 2025-11-28 17:57:16.273560895 +0000 UTC m=+0.050171130 container create 67b6a17631f62459f84b114ff539dfbaea7a193710a3dbe3755f77418d7004ee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2ff1e6ce-b114-4644-bacd-9ad17a33b1a4, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Nov 28 17:57:16 compute-0 systemd[1]: Started libpod-conmon-67b6a17631f62459f84b114ff539dfbaea7a193710a3dbe3755f77418d7004ee.scope.
Nov 28 17:57:16 compute-0 nova_compute[187223]: 2025-11-28 17:57:16.332 187227 DEBUG nova.compute.manager [req-71cdb08c-d270-4c60-b787-198b048b0d0f req-75c4dd6b-7dbf-4692-99d4-bd94e6a7a957 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] Received event network-vif-plugged-3868291d-fb0c-4bb9-94d3-897a190444b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:57:16 compute-0 nova_compute[187223]: 2025-11-28 17:57:16.333 187227 DEBUG oslo_concurrency.lockutils [req-71cdb08c-d270-4c60-b787-198b048b0d0f req-75c4dd6b-7dbf-4692-99d4-bd94e6a7a957 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:57:16 compute-0 nova_compute[187223]: 2025-11-28 17:57:16.334 187227 DEBUG oslo_concurrency.lockutils [req-71cdb08c-d270-4c60-b787-198b048b0d0f req-75c4dd6b-7dbf-4692-99d4-bd94e6a7a957 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:57:16 compute-0 nova_compute[187223]: 2025-11-28 17:57:16.334 187227 DEBUG oslo_concurrency.lockutils [req-71cdb08c-d270-4c60-b787-198b048b0d0f req-75c4dd6b-7dbf-4692-99d4-bd94e6a7a957 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:57:16 compute-0 nova_compute[187223]: 2025-11-28 17:57:16.334 187227 DEBUG nova.compute.manager [req-71cdb08c-d270-4c60-b787-198b048b0d0f req-75c4dd6b-7dbf-4692-99d4-bd94e6a7a957 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] Processing event network-vif-plugged-3868291d-fb0c-4bb9-94d3-897a190444b7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 28 17:57:16 compute-0 podman[218221]: 2025-11-28 17:57:16.245582587 +0000 UTC m=+0.022192872 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 17:57:16 compute-0 systemd[1]: Started libcrun container.
Nov 28 17:57:16 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90deb0c21e5f447735665ca476ac873dc9b8bd20126f7324ff20efba85ff7647/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 17:57:16 compute-0 podman[218221]: 2025-11-28 17:57:16.365688816 +0000 UTC m=+0.142299081 container init 67b6a17631f62459f84b114ff539dfbaea7a193710a3dbe3755f77418d7004ee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2ff1e6ce-b114-4644-bacd-9ad17a33b1a4, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 28 17:57:16 compute-0 podman[218221]: 2025-11-28 17:57:16.373578334 +0000 UTC m=+0.150188569 container start 67b6a17631f62459f84b114ff539dfbaea7a193710a3dbe3755f77418d7004ee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2ff1e6ce-b114-4644-bacd-9ad17a33b1a4, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Nov 28 17:57:16 compute-0 neutron-haproxy-ovnmeta-2ff1e6ce-b114-4644-bacd-9ad17a33b1a4[218236]: [NOTICE]   (218240) : New worker (218242) forked
Nov 28 17:57:16 compute-0 neutron-haproxy-ovnmeta-2ff1e6ce-b114-4644-bacd-9ad17a33b1a4[218236]: [NOTICE]   (218240) : Loading success.
Nov 28 17:57:16 compute-0 nova_compute[187223]: 2025-11-28 17:57:16.569 187227 DEBUG nova.virt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Emitting event <LifecycleEvent: 1764352636.5692272, 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:57:16 compute-0 nova_compute[187223]: 2025-11-28 17:57:16.570 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] VM Started (Lifecycle Event)
Nov 28 17:57:16 compute-0 nova_compute[187223]: 2025-11-28 17:57:16.572 187227 DEBUG nova.compute.manager [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 28 17:57:16 compute-0 nova_compute[187223]: 2025-11-28 17:57:16.576 187227 DEBUG nova.virt.libvirt.driver [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 28 17:57:16 compute-0 nova_compute[187223]: 2025-11-28 17:57:16.580 187227 INFO nova.virt.libvirt.driver [-] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] Instance spawned successfully.
Nov 28 17:57:16 compute-0 nova_compute[187223]: 2025-11-28 17:57:16.580 187227 DEBUG nova.virt.libvirt.driver [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 28 17:57:16 compute-0 nova_compute[187223]: 2025-11-28 17:57:16.607 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:57:16 compute-0 nova_compute[187223]: 2025-11-28 17:57:16.613 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 28 17:57:16 compute-0 nova_compute[187223]: 2025-11-28 17:57:16.616 187227 DEBUG nova.virt.libvirt.driver [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:57:16 compute-0 nova_compute[187223]: 2025-11-28 17:57:16.616 187227 DEBUG nova.virt.libvirt.driver [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:57:16 compute-0 nova_compute[187223]: 2025-11-28 17:57:16.616 187227 DEBUG nova.virt.libvirt.driver [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:57:16 compute-0 nova_compute[187223]: 2025-11-28 17:57:16.617 187227 DEBUG nova.virt.libvirt.driver [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:57:16 compute-0 nova_compute[187223]: 2025-11-28 17:57:16.617 187227 DEBUG nova.virt.libvirt.driver [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:57:16 compute-0 nova_compute[187223]: 2025-11-28 17:57:16.618 187227 DEBUG nova.virt.libvirt.driver [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:57:16 compute-0 nova_compute[187223]: 2025-11-28 17:57:16.648 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 28 17:57:16 compute-0 nova_compute[187223]: 2025-11-28 17:57:16.648 187227 DEBUG nova.virt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Emitting event <LifecycleEvent: 1764352636.569493, 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:57:16 compute-0 nova_compute[187223]: 2025-11-28 17:57:16.649 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] VM Paused (Lifecycle Event)
Nov 28 17:57:16 compute-0 nova_compute[187223]: 2025-11-28 17:57:16.679 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:57:16 compute-0 nova_compute[187223]: 2025-11-28 17:57:16.684 187227 DEBUG nova.virt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Emitting event <LifecycleEvent: 1764352636.5748925, 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:57:16 compute-0 nova_compute[187223]: 2025-11-28 17:57:16.685 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] VM Resumed (Lifecycle Event)
Nov 28 17:57:16 compute-0 nova_compute[187223]: 2025-11-28 17:57:16.691 187227 INFO nova.compute.manager [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] Took 6.75 seconds to spawn the instance on the hypervisor.
Nov 28 17:57:16 compute-0 nova_compute[187223]: 2025-11-28 17:57:16.691 187227 DEBUG nova.compute.manager [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:57:16 compute-0 nova_compute[187223]: 2025-11-28 17:57:16.709 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:57:16 compute-0 nova_compute[187223]: 2025-11-28 17:57:16.713 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 28 17:57:16 compute-0 nova_compute[187223]: 2025-11-28 17:57:16.749 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 28 17:57:16 compute-0 nova_compute[187223]: 2025-11-28 17:57:16.779 187227 INFO nova.compute.manager [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] Took 7.20 seconds to build instance.
Nov 28 17:57:16 compute-0 nova_compute[187223]: 2025-11-28 17:57:16.798 187227 DEBUG oslo_concurrency.lockutils [None req-3804588e-a6fd-4b00-8322-0a7f5894efe0 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Lock "3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.292s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:57:18 compute-0 nova_compute[187223]: 2025-11-28 17:57:18.253 187227 DEBUG nova.network.neutron [req-e29b91b6-9e27-4510-ae4e-cffc00b925ff req-5113f111-b44b-4c90-84f3-3fa4a532a478 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] Updated VIF entry in instance network info cache for port 3868291d-fb0c-4bb9-94d3-897a190444b7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 28 17:57:18 compute-0 nova_compute[187223]: 2025-11-28 17:57:18.253 187227 DEBUG nova.network.neutron [req-e29b91b6-9e27-4510-ae4e-cffc00b925ff req-5113f111-b44b-4c90-84f3-3fa4a532a478 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] Updating instance_info_cache with network_info: [{"id": "3868291d-fb0c-4bb9-94d3-897a190444b7", "address": "fa:16:3e:7b:aa:57", "network": {"id": "2ff1e6ce-b114-4644-bacd-9ad17a33b1a4", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-706434666-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d29c5331f43848f995592402975b88d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3868291d-fb", "ovs_interfaceid": "3868291d-fb0c-4bb9-94d3-897a190444b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:57:18 compute-0 nova_compute[187223]: 2025-11-28 17:57:18.275 187227 DEBUG oslo_concurrency.lockutils [req-e29b91b6-9e27-4510-ae4e-cffc00b925ff req-5113f111-b44b-4c90-84f3-3fa4a532a478 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Releasing lock "refresh_cache-3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 17:57:18 compute-0 nova_compute[187223]: 2025-11-28 17:57:18.294 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:57:18 compute-0 nova_compute[187223]: 2025-11-28 17:57:18.425 187227 DEBUG nova.compute.manager [req-6814f3e7-37ce-422f-9cd4-e543cc12fb9e req-db054b4e-0eee-4eec-a36c-58ce6ac317f2 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] Received event network-vif-plugged-3868291d-fb0c-4bb9-94d3-897a190444b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:57:18 compute-0 nova_compute[187223]: 2025-11-28 17:57:18.426 187227 DEBUG oslo_concurrency.lockutils [req-6814f3e7-37ce-422f-9cd4-e543cc12fb9e req-db054b4e-0eee-4eec-a36c-58ce6ac317f2 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:57:18 compute-0 nova_compute[187223]: 2025-11-28 17:57:18.426 187227 DEBUG oslo_concurrency.lockutils [req-6814f3e7-37ce-422f-9cd4-e543cc12fb9e req-db054b4e-0eee-4eec-a36c-58ce6ac317f2 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:57:18 compute-0 nova_compute[187223]: 2025-11-28 17:57:18.427 187227 DEBUG oslo_concurrency.lockutils [req-6814f3e7-37ce-422f-9cd4-e543cc12fb9e req-db054b4e-0eee-4eec-a36c-58ce6ac317f2 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:57:18 compute-0 nova_compute[187223]: 2025-11-28 17:57:18.427 187227 DEBUG nova.compute.manager [req-6814f3e7-37ce-422f-9cd4-e543cc12fb9e req-db054b4e-0eee-4eec-a36c-58ce6ac317f2 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] No waiting events found dispatching network-vif-plugged-3868291d-fb0c-4bb9-94d3-897a190444b7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:57:18 compute-0 nova_compute[187223]: 2025-11-28 17:57:18.427 187227 WARNING nova.compute.manager [req-6814f3e7-37ce-422f-9cd4-e543cc12fb9e req-db054b4e-0eee-4eec-a36c-58ce6ac317f2 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] Received unexpected event network-vif-plugged-3868291d-fb0c-4bb9-94d3-897a190444b7 for instance with vm_state active and task_state None.
Nov 28 17:57:19 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Nov 28 17:57:20 compute-0 nova_compute[187223]: 2025-11-28 17:57:20.237 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:57:23 compute-0 nova_compute[187223]: 2025-11-28 17:57:23.298 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:57:25 compute-0 nova_compute[187223]: 2025-11-28 17:57:25.264 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:57:26 compute-0 podman[218259]: 2025-11-28 17:57:26.205574011 +0000 UTC m=+0.058844530 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 17:57:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:57:27.708 104433 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:57:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:57:27.709 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:57:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:57:27.710 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:57:28 compute-0 nova_compute[187223]: 2025-11-28 17:57:28.300 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:57:28 compute-0 ovn_controller[95574]: 2025-11-28T17:57:28Z|00027|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:7b:aa:57 10.100.0.13
Nov 28 17:57:28 compute-0 ovn_controller[95574]: 2025-11-28T17:57:28Z|00028|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:7b:aa:57 10.100.0.13
Nov 28 17:57:29 compute-0 podman[197556]: time="2025-11-28T17:57:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:57:29 compute-0 podman[197556]: @ - - [28/Nov/2025:17:57:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18323 "" "Go-http-client/1.1"
Nov 28 17:57:29 compute-0 podman[197556]: @ - - [28/Nov/2025:17:57:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3066 "" "Go-http-client/1.1"
Nov 28 17:57:30 compute-0 nova_compute[187223]: 2025-11-28 17:57:30.266 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:57:31 compute-0 openstack_network_exporter[199717]: ERROR   17:57:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:57:31 compute-0 openstack_network_exporter[199717]: ERROR   17:57:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:57:31 compute-0 openstack_network_exporter[199717]: ERROR   17:57:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:57:31 compute-0 openstack_network_exporter[199717]: ERROR   17:57:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:57:31 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:57:31 compute-0 openstack_network_exporter[199717]: ERROR   17:57:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:57:31 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:57:33 compute-0 nova_compute[187223]: 2025-11-28 17:57:33.302 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:57:34 compute-0 podman[218293]: 2025-11-28 17:57:34.184115372 +0000 UTC m=+0.052249530 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 17:57:35 compute-0 nova_compute[187223]: 2025-11-28 17:57:35.271 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:57:38 compute-0 nova_compute[187223]: 2025-11-28 17:57:38.303 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:57:40 compute-0 nova_compute[187223]: 2025-11-28 17:57:40.271 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:57:40 compute-0 nova_compute[187223]: 2025-11-28 17:57:40.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:57:40 compute-0 nova_compute[187223]: 2025-11-28 17:57:40.684 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 28 17:57:40 compute-0 nova_compute[187223]: 2025-11-28 17:57:40.698 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 28 17:57:42 compute-0 podman[218313]: 2025-11-28 17:57:42.227912902 +0000 UTC m=+0.073235456 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 17:57:42 compute-0 podman[218314]: 2025-11-28 17:57:42.262705267 +0000 UTC m=+0.109128833 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 17:57:43 compute-0 nova_compute[187223]: 2025-11-28 17:57:43.305 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:57:45 compute-0 podman[218357]: 2025-11-28 17:57:45.194651405 +0000 UTC m=+0.061556588 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, release=1755695350, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, maintainer=Red Hat, Inc., name=ubi9-minimal, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, version=9.6, managed_by=edpm_ansible, architecture=x86_64, container_name=openstack_network_exporter)
Nov 28 17:57:45 compute-0 nova_compute[187223]: 2025-11-28 17:57:45.274 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:57:45 compute-0 nova_compute[187223]: 2025-11-28 17:57:45.697 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:57:45 compute-0 nova_compute[187223]: 2025-11-28 17:57:45.698 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 17:57:45 compute-0 ovn_controller[95574]: 2025-11-28T17:57:45Z|00194|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Nov 28 17:57:47 compute-0 nova_compute[187223]: 2025-11-28 17:57:47.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:57:47 compute-0 nova_compute[187223]: 2025-11-28 17:57:47.685 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:57:48 compute-0 nova_compute[187223]: 2025-11-28 17:57:48.307 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:57:49 compute-0 nova_compute[187223]: 2025-11-28 17:57:49.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:57:50 compute-0 nova_compute[187223]: 2025-11-28 17:57:50.275 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:57:53 compute-0 nova_compute[187223]: 2025-11-28 17:57:53.310 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:57:53 compute-0 nova_compute[187223]: 2025-11-28 17:57:53.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:57:54 compute-0 nova_compute[187223]: 2025-11-28 17:57:54.700 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:57:54 compute-0 nova_compute[187223]: 2025-11-28 17:57:54.701 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 17:57:54 compute-0 nova_compute[187223]: 2025-11-28 17:57:54.701 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 17:57:54 compute-0 nova_compute[187223]: 2025-11-28 17:57:54.920 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "refresh_cache-3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 17:57:54 compute-0 nova_compute[187223]: 2025-11-28 17:57:54.920 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquired lock "refresh_cache-3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 17:57:54 compute-0 nova_compute[187223]: 2025-11-28 17:57:54.921 187227 DEBUG nova.network.neutron [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 28 17:57:54 compute-0 nova_compute[187223]: 2025-11-28 17:57:54.921 187227 DEBUG nova.objects.instance [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:57:55 compute-0 nova_compute[187223]: 2025-11-28 17:57:55.328 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:57:56 compute-0 nova_compute[187223]: 2025-11-28 17:57:56.553 187227 DEBUG nova.network.neutron [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] Updating instance_info_cache with network_info: [{"id": "3868291d-fb0c-4bb9-94d3-897a190444b7", "address": "fa:16:3e:7b:aa:57", "network": {"id": "2ff1e6ce-b114-4644-bacd-9ad17a33b1a4", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-706434666-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d29c5331f43848f995592402975b88d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3868291d-fb", "ovs_interfaceid": "3868291d-fb0c-4bb9-94d3-897a190444b7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:57:56 compute-0 nova_compute[187223]: 2025-11-28 17:57:56.576 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Releasing lock "refresh_cache-3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 17:57:56 compute-0 nova_compute[187223]: 2025-11-28 17:57:56.576 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 28 17:57:56 compute-0 nova_compute[187223]: 2025-11-28 17:57:56.577 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:57:56 compute-0 nova_compute[187223]: 2025-11-28 17:57:56.577 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:57:56 compute-0 nova_compute[187223]: 2025-11-28 17:57:56.602 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:57:56 compute-0 nova_compute[187223]: 2025-11-28 17:57:56.602 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:57:56 compute-0 nova_compute[187223]: 2025-11-28 17:57:56.602 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:57:56 compute-0 nova_compute[187223]: 2025-11-28 17:57:56.603 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 17:57:56 compute-0 nova_compute[187223]: 2025-11-28 17:57:56.674 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:57:56 compute-0 podman[218380]: 2025-11-28 17:57:56.734499776 +0000 UTC m=+0.071990240 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 17:57:56 compute-0 nova_compute[187223]: 2025-11-28 17:57:56.748 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:57:56 compute-0 nova_compute[187223]: 2025-11-28 17:57:56.750 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:57:56 compute-0 nova_compute[187223]: 2025-11-28 17:57:56.808 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:57:56 compute-0 nova_compute[187223]: 2025-11-28 17:57:56.960 187227 WARNING nova.virt.libvirt.driver [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 17:57:56 compute-0 nova_compute[187223]: 2025-11-28 17:57:56.961 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5709MB free_disk=73.31267929077148GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 17:57:56 compute-0 nova_compute[187223]: 2025-11-28 17:57:56.962 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:57:56 compute-0 nova_compute[187223]: 2025-11-28 17:57:56.962 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:57:57 compute-0 nova_compute[187223]: 2025-11-28 17:57:57.134 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Instance 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 17:57:57 compute-0 nova_compute[187223]: 2025-11-28 17:57:57.134 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 17:57:57 compute-0 nova_compute[187223]: 2025-11-28 17:57:57.134 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 17:57:57 compute-0 nova_compute[187223]: 2025-11-28 17:57:57.230 187227 DEBUG nova.compute.provider_tree [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 17:57:57 compute-0 nova_compute[187223]: 2025-11-28 17:57:57.248 187227 DEBUG nova.scheduler.client.report [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 17:57:57 compute-0 nova_compute[187223]: 2025-11-28 17:57:57.275 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 17:57:57 compute-0 nova_compute[187223]: 2025-11-28 17:57:57.276 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.314s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:57:57 compute-0 nova_compute[187223]: 2025-11-28 17:57:57.382 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:57:57 compute-0 nova_compute[187223]: 2025-11-28 17:57:57.678 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:57:58 compute-0 nova_compute[187223]: 2025-11-28 17:57:58.312 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:57:59 compute-0 podman[197556]: time="2025-11-28T17:57:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:57:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:57:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18323 "" "Go-http-client/1.1"
Nov 28 17:57:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:57:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3067 "" "Go-http-client/1.1"
Nov 28 17:58:00 compute-0 nova_compute[187223]: 2025-11-28 17:58:00.372 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:58:01 compute-0 openstack_network_exporter[199717]: ERROR   17:58:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:58:01 compute-0 openstack_network_exporter[199717]: ERROR   17:58:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:58:01 compute-0 openstack_network_exporter[199717]: ERROR   17:58:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:58:01 compute-0 openstack_network_exporter[199717]: ERROR   17:58:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:58:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:58:01 compute-0 openstack_network_exporter[199717]: ERROR   17:58:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:58:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:58:03 compute-0 nova_compute[187223]: 2025-11-28 17:58:03.316 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:58:05 compute-0 podman[218411]: 2025-11-28 17:58:05.179651801 +0000 UTC m=+0.048539613 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 28 17:58:05 compute-0 nova_compute[187223]: 2025-11-28 17:58:05.381 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:58:06 compute-0 nova_compute[187223]: 2025-11-28 17:58:06.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:58:06 compute-0 nova_compute[187223]: 2025-11-28 17:58:06.684 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 28 17:58:08 compute-0 nova_compute[187223]: 2025-11-28 17:58:08.318 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:58:10 compute-0 nova_compute[187223]: 2025-11-28 17:58:10.416 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:58:13 compute-0 podman[218430]: 2025-11-28 17:58:13.191327858 +0000 UTC m=+0.058719927 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 17:58:13 compute-0 podman[218431]: 2025-11-28 17:58:13.277976801 +0000 UTC m=+0.139819319 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 28 17:58:13 compute-0 nova_compute[187223]: 2025-11-28 17:58:13.320 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:58:15 compute-0 nova_compute[187223]: 2025-11-28 17:58:15.417 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:58:16 compute-0 podman[218472]: 2025-11-28 17:58:16.191881208 +0000 UTC m=+0.052509888 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, maintainer=Red Hat, Inc., name=ubi9-minimal, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, architecture=x86_64)
Nov 28 17:58:18 compute-0 nova_compute[187223]: 2025-11-28 17:58:18.322 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:58:20 compute-0 nova_compute[187223]: 2025-11-28 17:58:20.420 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:58:23 compute-0 nova_compute[187223]: 2025-11-28 17:58:23.085 187227 DEBUG nova.virt.libvirt.driver [None req-9bb1036f-a718-4d9c-8b58-d2c633ac25ae a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 5d30319e-5bea-48ef-ae70-71e215054fc4] Creating tmpfile /var/lib/nova/instances/tmpj1gcdabi to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Nov 28 17:58:23 compute-0 nova_compute[187223]: 2025-11-28 17:58:23.218 187227 DEBUG nova.compute.manager [None req-9bb1036f-a718-4d9c-8b58-d2c633ac25ae a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpj1gcdabi',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Nov 28 17:58:23 compute-0 nova_compute[187223]: 2025-11-28 17:58:23.324 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:58:24 compute-0 nova_compute[187223]: 2025-11-28 17:58:24.213 187227 DEBUG nova.compute.manager [None req-9bb1036f-a718-4d9c-8b58-d2c633ac25ae a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpj1gcdabi',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='5d30319e-5bea-48ef-ae70-71e215054fc4',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Nov 28 17:58:24 compute-0 nova_compute[187223]: 2025-11-28 17:58:24.234 187227 DEBUG oslo_concurrency.lockutils [None req-9bb1036f-a718-4d9c-8b58-d2c633ac25ae a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "refresh_cache-5d30319e-5bea-48ef-ae70-71e215054fc4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 17:58:24 compute-0 nova_compute[187223]: 2025-11-28 17:58:24.235 187227 DEBUG oslo_concurrency.lockutils [None req-9bb1036f-a718-4d9c-8b58-d2c633ac25ae a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquired lock "refresh_cache-5d30319e-5bea-48ef-ae70-71e215054fc4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 17:58:24 compute-0 nova_compute[187223]: 2025-11-28 17:58:24.235 187227 DEBUG nova.network.neutron [None req-9bb1036f-a718-4d9c-8b58-d2c633ac25ae a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 5d30319e-5bea-48ef-ae70-71e215054fc4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 28 17:58:25 compute-0 nova_compute[187223]: 2025-11-28 17:58:25.421 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:58:27 compute-0 podman[218493]: 2025-11-28 17:58:27.186218143 +0000 UTC m=+0.052870788 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 17:58:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:58:27.710 104433 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:58:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:58:27.712 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:58:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:58:27.713 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:58:28 compute-0 nova_compute[187223]: 2025-11-28 17:58:28.327 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:58:28 compute-0 nova_compute[187223]: 2025-11-28 17:58:28.430 187227 DEBUG nova.network.neutron [None req-9bb1036f-a718-4d9c-8b58-d2c633ac25ae a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 5d30319e-5bea-48ef-ae70-71e215054fc4] Updating instance_info_cache with network_info: [{"id": "02b7708c-53c9-4a33-81e8-3c777c77986b", "address": "fa:16:3e:90:f4:f8", "network": {"id": "2ff1e6ce-b114-4644-bacd-9ad17a33b1a4", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-706434666-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d29c5331f43848f995592402975b88d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02b7708c-53", "ovs_interfaceid": "02b7708c-53c9-4a33-81e8-3c777c77986b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:58:28 compute-0 nova_compute[187223]: 2025-11-28 17:58:28.448 187227 DEBUG oslo_concurrency.lockutils [None req-9bb1036f-a718-4d9c-8b58-d2c633ac25ae a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Releasing lock "refresh_cache-5d30319e-5bea-48ef-ae70-71e215054fc4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 17:58:28 compute-0 nova_compute[187223]: 2025-11-28 17:58:28.451 187227 DEBUG nova.virt.libvirt.driver [None req-9bb1036f-a718-4d9c-8b58-d2c633ac25ae a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 5d30319e-5bea-48ef-ae70-71e215054fc4] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpj1gcdabi',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='5d30319e-5bea-48ef-ae70-71e215054fc4',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Nov 28 17:58:28 compute-0 nova_compute[187223]: 2025-11-28 17:58:28.452 187227 DEBUG nova.virt.libvirt.driver [None req-9bb1036f-a718-4d9c-8b58-d2c633ac25ae a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 5d30319e-5bea-48ef-ae70-71e215054fc4] Creating instance directory: /var/lib/nova/instances/5d30319e-5bea-48ef-ae70-71e215054fc4 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Nov 28 17:58:28 compute-0 nova_compute[187223]: 2025-11-28 17:58:28.453 187227 DEBUG nova.virt.libvirt.driver [None req-9bb1036f-a718-4d9c-8b58-d2c633ac25ae a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 5d30319e-5bea-48ef-ae70-71e215054fc4] Creating disk.info with the contents: {'/var/lib/nova/instances/5d30319e-5bea-48ef-ae70-71e215054fc4/disk': 'qcow2', '/var/lib/nova/instances/5d30319e-5bea-48ef-ae70-71e215054fc4/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Nov 28 17:58:28 compute-0 nova_compute[187223]: 2025-11-28 17:58:28.453 187227 DEBUG nova.virt.libvirt.driver [None req-9bb1036f-a718-4d9c-8b58-d2c633ac25ae a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 5d30319e-5bea-48ef-ae70-71e215054fc4] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Nov 28 17:58:28 compute-0 nova_compute[187223]: 2025-11-28 17:58:28.454 187227 DEBUG nova.objects.instance [None req-9bb1036f-a718-4d9c-8b58-d2c633ac25ae a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 5d30319e-5bea-48ef-ae70-71e215054fc4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:58:28 compute-0 nova_compute[187223]: 2025-11-28 17:58:28.482 187227 DEBUG oslo_concurrency.processutils [None req-9bb1036f-a718-4d9c-8b58-d2c633ac25ae a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:58:28 compute-0 nova_compute[187223]: 2025-11-28 17:58:28.555 187227 DEBUG oslo_concurrency.processutils [None req-9bb1036f-a718-4d9c-8b58-d2c633ac25ae a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:58:28 compute-0 nova_compute[187223]: 2025-11-28 17:58:28.556 187227 DEBUG oslo_concurrency.lockutils [None req-9bb1036f-a718-4d9c-8b58-d2c633ac25ae a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "bd6565e0cc32420aac21dd3de8842bc53bb13eba" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:58:28 compute-0 nova_compute[187223]: 2025-11-28 17:58:28.557 187227 DEBUG oslo_concurrency.lockutils [None req-9bb1036f-a718-4d9c-8b58-d2c633ac25ae a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "bd6565e0cc32420aac21dd3de8842bc53bb13eba" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:58:28 compute-0 nova_compute[187223]: 2025-11-28 17:58:28.571 187227 DEBUG oslo_concurrency.processutils [None req-9bb1036f-a718-4d9c-8b58-d2c633ac25ae a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:58:28 compute-0 nova_compute[187223]: 2025-11-28 17:58:28.642 187227 DEBUG oslo_concurrency.processutils [None req-9bb1036f-a718-4d9c-8b58-d2c633ac25ae a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:58:28 compute-0 nova_compute[187223]: 2025-11-28 17:58:28.643 187227 DEBUG oslo_concurrency.processutils [None req-9bb1036f-a718-4d9c-8b58-d2c633ac25ae a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba,backing_fmt=raw /var/lib/nova/instances/5d30319e-5bea-48ef-ae70-71e215054fc4/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:58:28 compute-0 nova_compute[187223]: 2025-11-28 17:58:28.676 187227 DEBUG oslo_concurrency.processutils [None req-9bb1036f-a718-4d9c-8b58-d2c633ac25ae a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba,backing_fmt=raw /var/lib/nova/instances/5d30319e-5bea-48ef-ae70-71e215054fc4/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:58:28 compute-0 nova_compute[187223]: 2025-11-28 17:58:28.678 187227 DEBUG oslo_concurrency.lockutils [None req-9bb1036f-a718-4d9c-8b58-d2c633ac25ae a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "bd6565e0cc32420aac21dd3de8842bc53bb13eba" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:58:28 compute-0 nova_compute[187223]: 2025-11-28 17:58:28.678 187227 DEBUG oslo_concurrency.processutils [None req-9bb1036f-a718-4d9c-8b58-d2c633ac25ae a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:58:28 compute-0 nova_compute[187223]: 2025-11-28 17:58:28.753 187227 DEBUG oslo_concurrency.processutils [None req-9bb1036f-a718-4d9c-8b58-d2c633ac25ae a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:58:28 compute-0 nova_compute[187223]: 2025-11-28 17:58:28.755 187227 DEBUG nova.virt.disk.api [None req-9bb1036f-a718-4d9c-8b58-d2c633ac25ae a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Checking if we can resize image /var/lib/nova/instances/5d30319e-5bea-48ef-ae70-71e215054fc4/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 28 17:58:28 compute-0 nova_compute[187223]: 2025-11-28 17:58:28.756 187227 DEBUG oslo_concurrency.processutils [None req-9bb1036f-a718-4d9c-8b58-d2c633ac25ae a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5d30319e-5bea-48ef-ae70-71e215054fc4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:58:28 compute-0 nova_compute[187223]: 2025-11-28 17:58:28.814 187227 DEBUG oslo_concurrency.processutils [None req-9bb1036f-a718-4d9c-8b58-d2c633ac25ae a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5d30319e-5bea-48ef-ae70-71e215054fc4/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:58:28 compute-0 nova_compute[187223]: 2025-11-28 17:58:28.815 187227 DEBUG nova.virt.disk.api [None req-9bb1036f-a718-4d9c-8b58-d2c633ac25ae a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Cannot resize image /var/lib/nova/instances/5d30319e-5bea-48ef-ae70-71e215054fc4/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 28 17:58:28 compute-0 nova_compute[187223]: 2025-11-28 17:58:28.816 187227 DEBUG nova.objects.instance [None req-9bb1036f-a718-4d9c-8b58-d2c633ac25ae a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lazy-loading 'migration_context' on Instance uuid 5d30319e-5bea-48ef-ae70-71e215054fc4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:58:28 compute-0 nova_compute[187223]: 2025-11-28 17:58:28.835 187227 DEBUG oslo_concurrency.processutils [None req-9bb1036f-a718-4d9c-8b58-d2c633ac25ae a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/5d30319e-5bea-48ef-ae70-71e215054fc4/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:58:28 compute-0 nova_compute[187223]: 2025-11-28 17:58:28.864 187227 DEBUG oslo_concurrency.processutils [None req-9bb1036f-a718-4d9c-8b58-d2c633ac25ae a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/5d30319e-5bea-48ef-ae70-71e215054fc4/disk.config 485376" returned: 0 in 0.029s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:58:28 compute-0 nova_compute[187223]: 2025-11-28 17:58:28.867 187227 DEBUG nova.virt.libvirt.volume.remotefs [None req-9bb1036f-a718-4d9c-8b58-d2c633ac25ae a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Copying file compute-1.ctlplane.example.com:/var/lib/nova/instances/5d30319e-5bea-48ef-ae70-71e215054fc4/disk.config to /var/lib/nova/instances/5d30319e-5bea-48ef-ae70-71e215054fc4 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Nov 28 17:58:28 compute-0 nova_compute[187223]: 2025-11-28 17:58:28.868 187227 DEBUG oslo_concurrency.processutils [None req-9bb1036f-a718-4d9c-8b58-d2c633ac25ae a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Running cmd (subprocess): scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/5d30319e-5bea-48ef-ae70-71e215054fc4/disk.config /var/lib/nova/instances/5d30319e-5bea-48ef-ae70-71e215054fc4 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:58:29 compute-0 nova_compute[187223]: 2025-11-28 17:58:29.364 187227 DEBUG oslo_concurrency.processutils [None req-9bb1036f-a718-4d9c-8b58-d2c633ac25ae a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] CMD "scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/5d30319e-5bea-48ef-ae70-71e215054fc4/disk.config /var/lib/nova/instances/5d30319e-5bea-48ef-ae70-71e215054fc4" returned: 0 in 0.495s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:58:29 compute-0 nova_compute[187223]: 2025-11-28 17:58:29.366 187227 DEBUG nova.virt.libvirt.driver [None req-9bb1036f-a718-4d9c-8b58-d2c633ac25ae a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 5d30319e-5bea-48ef-ae70-71e215054fc4] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Nov 28 17:58:29 compute-0 nova_compute[187223]: 2025-11-28 17:58:29.369 187227 DEBUG nova.virt.libvirt.vif [None req-9bb1036f-a718-4d9c-8b58-d2c633ac25ae a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T17:56:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-354005683',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-354005683',id=25,image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-28T17:56:59Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d29c5331f43848f995592402975b88d1',ramdisk_id='',reservation_id='r-zhi6qxko',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-480471442',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-480471442-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T17:56:59Z,user_data=None,user_id='0be3599a4e1a4dc2bece41c2543e4bc9',uuid=5d30319e-5bea-48ef-ae70-71e215054fc4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "02b7708c-53c9-4a33-81e8-3c777c77986b", "address": "fa:16:3e:90:f4:f8", "network": {"id": "2ff1e6ce-b114-4644-bacd-9ad17a33b1a4", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-706434666-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d29c5331f43848f995592402975b88d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap02b7708c-53", "ovs_interfaceid": "02b7708c-53c9-4a33-81e8-3c777c77986b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 28 17:58:29 compute-0 nova_compute[187223]: 2025-11-28 17:58:29.370 187227 DEBUG nova.network.os_vif_util [None req-9bb1036f-a718-4d9c-8b58-d2c633ac25ae a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Converting VIF {"id": "02b7708c-53c9-4a33-81e8-3c777c77986b", "address": "fa:16:3e:90:f4:f8", "network": {"id": "2ff1e6ce-b114-4644-bacd-9ad17a33b1a4", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-706434666-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d29c5331f43848f995592402975b88d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap02b7708c-53", "ovs_interfaceid": "02b7708c-53c9-4a33-81e8-3c777c77986b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 17:58:29 compute-0 nova_compute[187223]: 2025-11-28 17:58:29.373 187227 DEBUG nova.network.os_vif_util [None req-9bb1036f-a718-4d9c-8b58-d2c633ac25ae a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:90:f4:f8,bridge_name='br-int',has_traffic_filtering=True,id=02b7708c-53c9-4a33-81e8-3c777c77986b,network=Network(2ff1e6ce-b114-4644-bacd-9ad17a33b1a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02b7708c-53') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 17:58:29 compute-0 nova_compute[187223]: 2025-11-28 17:58:29.374 187227 DEBUG os_vif [None req-9bb1036f-a718-4d9c-8b58-d2c633ac25ae a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:90:f4:f8,bridge_name='br-int',has_traffic_filtering=True,id=02b7708c-53c9-4a33-81e8-3c777c77986b,network=Network(2ff1e6ce-b114-4644-bacd-9ad17a33b1a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02b7708c-53') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 28 17:58:29 compute-0 nova_compute[187223]: 2025-11-28 17:58:29.375 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:58:29 compute-0 nova_compute[187223]: 2025-11-28 17:58:29.376 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:58:29 compute-0 nova_compute[187223]: 2025-11-28 17:58:29.378 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 17:58:29 compute-0 nova_compute[187223]: 2025-11-28 17:58:29.383 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:58:29 compute-0 nova_compute[187223]: 2025-11-28 17:58:29.384 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap02b7708c-53, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:58:29 compute-0 nova_compute[187223]: 2025-11-28 17:58:29.385 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap02b7708c-53, col_values=(('external_ids', {'iface-id': '02b7708c-53c9-4a33-81e8-3c777c77986b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:90:f4:f8', 'vm-uuid': '5d30319e-5bea-48ef-ae70-71e215054fc4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:58:29 compute-0 nova_compute[187223]: 2025-11-28 17:58:29.388 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:58:29 compute-0 NetworkManager[55763]: <info>  [1764352709.3896] manager: (tap02b7708c-53): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/82)
Nov 28 17:58:29 compute-0 nova_compute[187223]: 2025-11-28 17:58:29.391 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 17:58:29 compute-0 nova_compute[187223]: 2025-11-28 17:58:29.397 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:58:29 compute-0 nova_compute[187223]: 2025-11-28 17:58:29.398 187227 INFO os_vif [None req-9bb1036f-a718-4d9c-8b58-d2c633ac25ae a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:90:f4:f8,bridge_name='br-int',has_traffic_filtering=True,id=02b7708c-53c9-4a33-81e8-3c777c77986b,network=Network(2ff1e6ce-b114-4644-bacd-9ad17a33b1a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02b7708c-53')
Nov 28 17:58:29 compute-0 nova_compute[187223]: 2025-11-28 17:58:29.399 187227 DEBUG nova.virt.libvirt.driver [None req-9bb1036f-a718-4d9c-8b58-d2c633ac25ae a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Nov 28 17:58:29 compute-0 nova_compute[187223]: 2025-11-28 17:58:29.399 187227 DEBUG nova.compute.manager [None req-9bb1036f-a718-4d9c-8b58-d2c633ac25ae a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpj1gcdabi',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='5d30319e-5bea-48ef-ae70-71e215054fc4',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Nov 28 17:58:29 compute-0 podman[197556]: time="2025-11-28T17:58:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:58:29 compute-0 podman[197556]: @ - - [28/Nov/2025:17:58:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18323 "" "Go-http-client/1.1"
Nov 28 17:58:29 compute-0 podman[197556]: @ - - [28/Nov/2025:17:58:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3060 "" "Go-http-client/1.1"
Nov 28 17:58:30 compute-0 nova_compute[187223]: 2025-11-28 17:58:30.423 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:58:31 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:58:31.301 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=27, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3e:3c:af', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '0e:e0:60:f9:5f:25'}, ipsec=False) old=SB_Global(nb_cfg=26) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 17:58:31 compute-0 nova_compute[187223]: 2025-11-28 17:58:31.302 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:58:31 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:58:31.303 104433 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 17:58:31 compute-0 openstack_network_exporter[199717]: ERROR   17:58:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:58:31 compute-0 openstack_network_exporter[199717]: ERROR   17:58:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:58:31 compute-0 openstack_network_exporter[199717]: ERROR   17:58:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:58:31 compute-0 openstack_network_exporter[199717]: ERROR   17:58:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:58:31 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:58:31 compute-0 openstack_network_exporter[199717]: ERROR   17:58:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:58:31 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:58:31 compute-0 nova_compute[187223]: 2025-11-28 17:58:31.617 187227 DEBUG nova.network.neutron [None req-9bb1036f-a718-4d9c-8b58-d2c633ac25ae a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 5d30319e-5bea-48ef-ae70-71e215054fc4] Port 02b7708c-53c9-4a33-81e8-3c777c77986b updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Nov 28 17:58:31 compute-0 nova_compute[187223]: 2025-11-28 17:58:31.618 187227 DEBUG nova.compute.manager [None req-9bb1036f-a718-4d9c-8b58-d2c633ac25ae a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpj1gcdabi',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='5d30319e-5bea-48ef-ae70-71e215054fc4',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Nov 28 17:58:31 compute-0 systemd[1]: Starting libvirt proxy daemon...
Nov 28 17:58:31 compute-0 systemd[1]: Started libvirt proxy daemon.
Nov 28 17:58:31 compute-0 kernel: tap02b7708c-53: entered promiscuous mode
Nov 28 17:58:31 compute-0 NetworkManager[55763]: <info>  [1764352711.9174] manager: (tap02b7708c-53): new Tun device (/org/freedesktop/NetworkManager/Devices/83)
Nov 28 17:58:31 compute-0 ovn_controller[95574]: 2025-11-28T17:58:31Z|00195|binding|INFO|Claiming lport 02b7708c-53c9-4a33-81e8-3c777c77986b for this additional chassis.
Nov 28 17:58:31 compute-0 ovn_controller[95574]: 2025-11-28T17:58:31Z|00196|binding|INFO|02b7708c-53c9-4a33-81e8-3c777c77986b: Claiming fa:16:3e:90:f4:f8 10.100.0.5
Nov 28 17:58:31 compute-0 nova_compute[187223]: 2025-11-28 17:58:31.918 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:58:31 compute-0 ovn_controller[95574]: 2025-11-28T17:58:31Z|00197|binding|INFO|Setting lport 02b7708c-53c9-4a33-81e8-3c777c77986b ovn-installed in OVS
Nov 28 17:58:31 compute-0 nova_compute[187223]: 2025-11-28 17:58:31.935 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:58:31 compute-0 systemd-udevd[218572]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 17:58:31 compute-0 systemd-machined[153517]: New machine qemu-19-instance-00000019.
Nov 28 17:58:31 compute-0 NetworkManager[55763]: <info>  [1764352711.9628] device (tap02b7708c-53): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 28 17:58:31 compute-0 NetworkManager[55763]: <info>  [1764352711.9650] device (tap02b7708c-53): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 28 17:58:31 compute-0 systemd[1]: Started Virtual Machine qemu-19-instance-00000019.
Nov 28 17:58:32 compute-0 nova_compute[187223]: 2025-11-28 17:58:32.404 187227 DEBUG nova.virt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Emitting event <LifecycleEvent: 1764352712.40434, 5d30319e-5bea-48ef-ae70-71e215054fc4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:58:32 compute-0 nova_compute[187223]: 2025-11-28 17:58:32.405 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 5d30319e-5bea-48ef-ae70-71e215054fc4] VM Started (Lifecycle Event)
Nov 28 17:58:32 compute-0 nova_compute[187223]: 2025-11-28 17:58:32.426 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 5d30319e-5bea-48ef-ae70-71e215054fc4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:58:33 compute-0 nova_compute[187223]: 2025-11-28 17:58:33.155 187227 DEBUG nova.virt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Emitting event <LifecycleEvent: 1764352713.1555755, 5d30319e-5bea-48ef-ae70-71e215054fc4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:58:33 compute-0 nova_compute[187223]: 2025-11-28 17:58:33.156 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 5d30319e-5bea-48ef-ae70-71e215054fc4] VM Resumed (Lifecycle Event)
Nov 28 17:58:33 compute-0 nova_compute[187223]: 2025-11-28 17:58:33.184 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 5d30319e-5bea-48ef-ae70-71e215054fc4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:58:33 compute-0 nova_compute[187223]: 2025-11-28 17:58:33.188 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 5d30319e-5bea-48ef-ae70-71e215054fc4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 28 17:58:33 compute-0 nova_compute[187223]: 2025-11-28 17:58:33.209 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: 5d30319e-5bea-48ef-ae70-71e215054fc4] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-0.ctlplane.example.com
Nov 28 17:58:34 compute-0 nova_compute[187223]: 2025-11-28 17:58:34.453 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:58:35 compute-0 ovn_controller[95574]: 2025-11-28T17:58:35Z|00198|binding|INFO|Claiming lport 02b7708c-53c9-4a33-81e8-3c777c77986b for this chassis.
Nov 28 17:58:35 compute-0 ovn_controller[95574]: 2025-11-28T17:58:35Z|00199|binding|INFO|02b7708c-53c9-4a33-81e8-3c777c77986b: Claiming fa:16:3e:90:f4:f8 10.100.0.5
Nov 28 17:58:35 compute-0 ovn_controller[95574]: 2025-11-28T17:58:35Z|00200|binding|INFO|Setting lport 02b7708c-53c9-4a33-81e8-3c777c77986b up in Southbound
Nov 28 17:58:35 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:58:35.292 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:90:f4:f8 10.100.0.5'], port_security=['fa:16:3e:90:f4:f8 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '5d30319e-5bea-48ef-ae70-71e215054fc4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2ff1e6ce-b114-4644-bacd-9ad17a33b1a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd29c5331f43848f995592402975b88d1', 'neutron:revision_number': '11', 'neutron:security_group_ids': '4a4ddb51-0ed6-4517-9c9f-37fff06e30e0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c9230971-b926-4b1d-9b28-cbd943422cad, chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], logical_port=02b7708c-53c9-4a33-81e8-3c777c77986b) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 17:58:35 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:58:35.293 104433 INFO neutron.agent.ovn.metadata.agent [-] Port 02b7708c-53c9-4a33-81e8-3c777c77986b in datapath 2ff1e6ce-b114-4644-bacd-9ad17a33b1a4 bound to our chassis
Nov 28 17:58:35 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:58:35.294 104433 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2ff1e6ce-b114-4644-bacd-9ad17a33b1a4
Nov 28 17:58:35 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:58:35.313 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[584544bf-049a-4439-b7b4-2f25ee5923bb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:58:35 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:58:35.343 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[858a654a-3936-4948-98a0-be3578a7cb74]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:58:35 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:58:35.346 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[3e8e6488-5cca-49cb-815a-3644cb81d5ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:58:35 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:58:35.374 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[c6713f10-0fca-4aa3-a60d-973ee8aaf44d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:58:35 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:58:35.390 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[f6156295-7386-476e-a37c-442469408f90]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2ff1e6ce-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:66:92:f5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 56], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 581035, 'reachable_time': 15569, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218600, 'error': None, 'target': 'ovnmeta-2ff1e6ce-b114-4644-bacd-9ad17a33b1a4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:58:35 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:58:35.402 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[68b9cc3a-3522-412b-8e9c-1b279be2367f]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap2ff1e6ce-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 581046, 'tstamp': 581046}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218601, 'error': None, 'target': 'ovnmeta-2ff1e6ce-b114-4644-bacd-9ad17a33b1a4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap2ff1e6ce-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 581048, 'tstamp': 581048}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218601, 'error': None, 'target': 'ovnmeta-2ff1e6ce-b114-4644-bacd-9ad17a33b1a4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:58:35 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:58:35.403 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2ff1e6ce-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:58:35 compute-0 nova_compute[187223]: 2025-11-28 17:58:35.405 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:58:35 compute-0 nova_compute[187223]: 2025-11-28 17:58:35.406 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:58:35 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:58:35.406 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2ff1e6ce-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:58:35 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:58:35.406 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 17:58:35 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:58:35.407 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2ff1e6ce-b0, col_values=(('external_ids', {'iface-id': 'c44344cc-bddd-480f-8312-6b31c21f97df'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:58:35 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:58:35.407 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 17:58:35 compute-0 nova_compute[187223]: 2025-11-28 17:58:35.425 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:58:35 compute-0 nova_compute[187223]: 2025-11-28 17:58:35.506 187227 INFO nova.compute.manager [None req-9bb1036f-a718-4d9c-8b58-d2c633ac25ae a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 5d30319e-5bea-48ef-ae70-71e215054fc4] Post operation of migration started
Nov 28 17:58:35 compute-0 nova_compute[187223]: 2025-11-28 17:58:35.933 187227 DEBUG oslo_concurrency.lockutils [None req-9bb1036f-a718-4d9c-8b58-d2c633ac25ae a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "refresh_cache-5d30319e-5bea-48ef-ae70-71e215054fc4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 17:58:35 compute-0 nova_compute[187223]: 2025-11-28 17:58:35.933 187227 DEBUG oslo_concurrency.lockutils [None req-9bb1036f-a718-4d9c-8b58-d2c633ac25ae a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquired lock "refresh_cache-5d30319e-5bea-48ef-ae70-71e215054fc4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 17:58:35 compute-0 nova_compute[187223]: 2025-11-28 17:58:35.933 187227 DEBUG nova.network.neutron [None req-9bb1036f-a718-4d9c-8b58-d2c633ac25ae a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 5d30319e-5bea-48ef-ae70-71e215054fc4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 28 17:58:36 compute-0 podman[218602]: 2025-11-28 17:58:36.204678366 +0000 UTC m=+0.061090205 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 17:58:36 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:58:36.305 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2ad2dbac-a967-40fb-b69b-7c374c5f8e9d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '27'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:58:37 compute-0 nova_compute[187223]: 2025-11-28 17:58:37.283 187227 DEBUG nova.network.neutron [None req-9bb1036f-a718-4d9c-8b58-d2c633ac25ae a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 5d30319e-5bea-48ef-ae70-71e215054fc4] Updating instance_info_cache with network_info: [{"id": "02b7708c-53c9-4a33-81e8-3c777c77986b", "address": "fa:16:3e:90:f4:f8", "network": {"id": "2ff1e6ce-b114-4644-bacd-9ad17a33b1a4", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-706434666-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d29c5331f43848f995592402975b88d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02b7708c-53", "ovs_interfaceid": "02b7708c-53c9-4a33-81e8-3c777c77986b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:58:37 compute-0 nova_compute[187223]: 2025-11-28 17:58:37.313 187227 DEBUG oslo_concurrency.lockutils [None req-9bb1036f-a718-4d9c-8b58-d2c633ac25ae a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Releasing lock "refresh_cache-5d30319e-5bea-48ef-ae70-71e215054fc4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 17:58:37 compute-0 nova_compute[187223]: 2025-11-28 17:58:37.336 187227 DEBUG oslo_concurrency.lockutils [None req-9bb1036f-a718-4d9c-8b58-d2c633ac25ae a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:58:37 compute-0 nova_compute[187223]: 2025-11-28 17:58:37.337 187227 DEBUG oslo_concurrency.lockutils [None req-9bb1036f-a718-4d9c-8b58-d2c633ac25ae a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:58:37 compute-0 nova_compute[187223]: 2025-11-28 17:58:37.337 187227 DEBUG oslo_concurrency.lockutils [None req-9bb1036f-a718-4d9c-8b58-d2c633ac25ae a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:58:37 compute-0 nova_compute[187223]: 2025-11-28 17:58:37.342 187227 INFO nova.virt.libvirt.driver [None req-9bb1036f-a718-4d9c-8b58-d2c633ac25ae a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 5d30319e-5bea-48ef-ae70-71e215054fc4] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Nov 28 17:58:37 compute-0 virtqemud[186845]: Domain id=19 name='instance-00000019' uuid=5d30319e-5bea-48ef-ae70-71e215054fc4 is tainted: custom-monitor
Nov 28 17:58:38 compute-0 nova_compute[187223]: 2025-11-28 17:58:38.350 187227 INFO nova.virt.libvirt.driver [None req-9bb1036f-a718-4d9c-8b58-d2c633ac25ae a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 5d30319e-5bea-48ef-ae70-71e215054fc4] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Nov 28 17:58:39 compute-0 nova_compute[187223]: 2025-11-28 17:58:39.356 187227 INFO nova.virt.libvirt.driver [None req-9bb1036f-a718-4d9c-8b58-d2c633ac25ae a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 5d30319e-5bea-48ef-ae70-71e215054fc4] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Nov 28 17:58:39 compute-0 nova_compute[187223]: 2025-11-28 17:58:39.361 187227 DEBUG nova.compute.manager [None req-9bb1036f-a718-4d9c-8b58-d2c633ac25ae a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 5d30319e-5bea-48ef-ae70-71e215054fc4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:58:39 compute-0 nova_compute[187223]: 2025-11-28 17:58:39.382 187227 DEBUG nova.objects.instance [None req-9bb1036f-a718-4d9c-8b58-d2c633ac25ae a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 5d30319e-5bea-48ef-ae70-71e215054fc4] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Nov 28 17:58:39 compute-0 nova_compute[187223]: 2025-11-28 17:58:39.456 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:58:40 compute-0 nova_compute[187223]: 2025-11-28 17:58:40.427 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:58:43 compute-0 nova_compute[187223]: 2025-11-28 17:58:43.543 187227 DEBUG oslo_concurrency.lockutils [None req-9b33de58-191d-4b9b-921a-96c0efc31240 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Acquiring lock "3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:58:43 compute-0 nova_compute[187223]: 2025-11-28 17:58:43.544 187227 DEBUG oslo_concurrency.lockutils [None req-9b33de58-191d-4b9b-921a-96c0efc31240 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Lock "3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:58:43 compute-0 nova_compute[187223]: 2025-11-28 17:58:43.544 187227 DEBUG oslo_concurrency.lockutils [None req-9b33de58-191d-4b9b-921a-96c0efc31240 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Acquiring lock "3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:58:43 compute-0 nova_compute[187223]: 2025-11-28 17:58:43.544 187227 DEBUG oslo_concurrency.lockutils [None req-9b33de58-191d-4b9b-921a-96c0efc31240 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Lock "3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:58:43 compute-0 nova_compute[187223]: 2025-11-28 17:58:43.544 187227 DEBUG oslo_concurrency.lockutils [None req-9b33de58-191d-4b9b-921a-96c0efc31240 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Lock "3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:58:43 compute-0 nova_compute[187223]: 2025-11-28 17:58:43.545 187227 INFO nova.compute.manager [None req-9b33de58-191d-4b9b-921a-96c0efc31240 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] Terminating instance
Nov 28 17:58:43 compute-0 nova_compute[187223]: 2025-11-28 17:58:43.546 187227 DEBUG nova.compute.manager [None req-9b33de58-191d-4b9b-921a-96c0efc31240 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 28 17:58:43 compute-0 kernel: tap3868291d-fb (unregistering): left promiscuous mode
Nov 28 17:58:43 compute-0 NetworkManager[55763]: <info>  [1764352723.5839] device (tap3868291d-fb): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 28 17:58:43 compute-0 nova_compute[187223]: 2025-11-28 17:58:43.587 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:58:43 compute-0 ovn_controller[95574]: 2025-11-28T17:58:43Z|00201|binding|INFO|Releasing lport 3868291d-fb0c-4bb9-94d3-897a190444b7 from this chassis (sb_readonly=0)
Nov 28 17:58:43 compute-0 ovn_controller[95574]: 2025-11-28T17:58:43Z|00202|binding|INFO|Setting lport 3868291d-fb0c-4bb9-94d3-897a190444b7 down in Southbound
Nov 28 17:58:43 compute-0 ovn_controller[95574]: 2025-11-28T17:58:43Z|00203|binding|INFO|Removing iface tap3868291d-fb ovn-installed in OVS
Nov 28 17:58:43 compute-0 nova_compute[187223]: 2025-11-28 17:58:43.589 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:58:43 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:58:43.598 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7b:aa:57 10.100.0.13'], port_security=['fa:16:3e:7b:aa:57 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2ff1e6ce-b114-4644-bacd-9ad17a33b1a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd29c5331f43848f995592402975b88d1', 'neutron:revision_number': '4', 'neutron:security_group_ids': '4a4ddb51-0ed6-4517-9c9f-37fff06e30e0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c9230971-b926-4b1d-9b28-cbd943422cad, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], logical_port=3868291d-fb0c-4bb9-94d3-897a190444b7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 17:58:43 compute-0 nova_compute[187223]: 2025-11-28 17:58:43.599 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:58:43 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:58:43.599 104433 INFO neutron.agent.ovn.metadata.agent [-] Port 3868291d-fb0c-4bb9-94d3-897a190444b7 in datapath 2ff1e6ce-b114-4644-bacd-9ad17a33b1a4 unbound from our chassis
Nov 28 17:58:43 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:58:43.600 104433 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2ff1e6ce-b114-4644-bacd-9ad17a33b1a4
Nov 28 17:58:43 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:58:43.632 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[85a576f1-0931-4f0d-8077-61a14347364b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:58:43 compute-0 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d0000001a.scope: Deactivated successfully.
Nov 28 17:58:43 compute-0 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d0000001a.scope: Consumed 16.296s CPU time.
Nov 28 17:58:43 compute-0 systemd-machined[153517]: Machine qemu-18-instance-0000001a terminated.
Nov 28 17:58:43 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:58:43.666 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[ab35d1f7-2dfe-421a-ae0f-1ad40ef987fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:58:43 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:58:43.669 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[de44d686-a835-4b82-a1ce-06ca4cbd6bf1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:58:43 compute-0 podman[218625]: 2025-11-28 17:58:43.734199037 +0000 UTC m=+0.131161679 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 17:58:43 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:58:43.738 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[c73a59a6-846b-464b-a57b-c7e2bfa47925]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:58:43 compute-0 podman[218628]: 2025-11-28 17:58:43.753269698 +0000 UTC m=+0.142372263 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible)
Nov 28 17:58:43 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:58:43.762 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[2dd78045-2b6f-42a6-910d-e8b5848f7fad]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2ff1e6ce-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:66:92:f5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 56], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 581035, 'reachable_time': 15569, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218678, 'error': None, 'target': 'ovnmeta-2ff1e6ce-b114-4644-bacd-9ad17a33b1a4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:58:43 compute-0 nova_compute[187223]: 2025-11-28 17:58:43.770 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:58:43 compute-0 nova_compute[187223]: 2025-11-28 17:58:43.775 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:58:43 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:58:43.780 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[1070c8c2-104b-45af-abe8-9fa08b7620dd]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap2ff1e6ce-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 581046, 'tstamp': 581046}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218683, 'error': None, 'target': 'ovnmeta-2ff1e6ce-b114-4644-bacd-9ad17a33b1a4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap2ff1e6ce-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 581048, 'tstamp': 581048}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218683, 'error': None, 'target': 'ovnmeta-2ff1e6ce-b114-4644-bacd-9ad17a33b1a4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:58:43 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:58:43.781 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2ff1e6ce-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:58:43 compute-0 nova_compute[187223]: 2025-11-28 17:58:43.783 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:58:43 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:58:43.788 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2ff1e6ce-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:58:43 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:58:43.789 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 17:58:43 compute-0 nova_compute[187223]: 2025-11-28 17:58:43.789 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:58:43 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:58:43.789 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2ff1e6ce-b0, col_values=(('external_ids', {'iface-id': 'c44344cc-bddd-480f-8312-6b31c21f97df'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:58:43 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:58:43.789 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 17:58:43 compute-0 nova_compute[187223]: 2025-11-28 17:58:43.810 187227 INFO nova.virt.libvirt.driver [-] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] Instance destroyed successfully.
Nov 28 17:58:43 compute-0 nova_compute[187223]: 2025-11-28 17:58:43.811 187227 DEBUG nova.objects.instance [None req-9b33de58-191d-4b9b-921a-96c0efc31240 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Lazy-loading 'resources' on Instance uuid 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:58:43 compute-0 nova_compute[187223]: 2025-11-28 17:58:43.822 187227 DEBUG nova.virt.libvirt.vif [None req-9b33de58-191d-4b9b-921a-96c0efc31240 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T17:57:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-1751850922',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-1751850922',id=26,image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-28T17:57:16Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d29c5331f43848f995592402975b88d1',ramdisk_id='',reservation_id='r-dr437yks',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-480471442',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-480471442-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-28T17:57:16Z,user_data=None,user_id='0be3599a4e1a4dc2bece41c2543e4bc9',uuid=3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3868291d-fb0c-4bb9-94d3-897a190444b7", "address": "fa:16:3e:7b:aa:57", "network": {"id": "2ff1e6ce-b114-4644-bacd-9ad17a33b1a4", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-706434666-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d29c5331f43848f995592402975b88d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3868291d-fb", "ovs_interfaceid": "3868291d-fb0c-4bb9-94d3-897a190444b7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 28 17:58:43 compute-0 nova_compute[187223]: 2025-11-28 17:58:43.822 187227 DEBUG nova.network.os_vif_util [None req-9b33de58-191d-4b9b-921a-96c0efc31240 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Converting VIF {"id": "3868291d-fb0c-4bb9-94d3-897a190444b7", "address": "fa:16:3e:7b:aa:57", "network": {"id": "2ff1e6ce-b114-4644-bacd-9ad17a33b1a4", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-706434666-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d29c5331f43848f995592402975b88d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3868291d-fb", "ovs_interfaceid": "3868291d-fb0c-4bb9-94d3-897a190444b7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 17:58:43 compute-0 nova_compute[187223]: 2025-11-28 17:58:43.823 187227 DEBUG nova.network.os_vif_util [None req-9b33de58-191d-4b9b-921a-96c0efc31240 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7b:aa:57,bridge_name='br-int',has_traffic_filtering=True,id=3868291d-fb0c-4bb9-94d3-897a190444b7,network=Network(2ff1e6ce-b114-4644-bacd-9ad17a33b1a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3868291d-fb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 17:58:43 compute-0 nova_compute[187223]: 2025-11-28 17:58:43.823 187227 DEBUG os_vif [None req-9b33de58-191d-4b9b-921a-96c0efc31240 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:7b:aa:57,bridge_name='br-int',has_traffic_filtering=True,id=3868291d-fb0c-4bb9-94d3-897a190444b7,network=Network(2ff1e6ce-b114-4644-bacd-9ad17a33b1a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3868291d-fb') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 28 17:58:43 compute-0 nova_compute[187223]: 2025-11-28 17:58:43.824 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:58:43 compute-0 nova_compute[187223]: 2025-11-28 17:58:43.825 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3868291d-fb, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:58:43 compute-0 nova_compute[187223]: 2025-11-28 17:58:43.826 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:58:43 compute-0 nova_compute[187223]: 2025-11-28 17:58:43.827 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 17:58:43 compute-0 nova_compute[187223]: 2025-11-28 17:58:43.830 187227 INFO os_vif [None req-9b33de58-191d-4b9b-921a-96c0efc31240 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:7b:aa:57,bridge_name='br-int',has_traffic_filtering=True,id=3868291d-fb0c-4bb9-94d3-897a190444b7,network=Network(2ff1e6ce-b114-4644-bacd-9ad17a33b1a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3868291d-fb')
Nov 28 17:58:43 compute-0 nova_compute[187223]: 2025-11-28 17:58:43.830 187227 INFO nova.virt.libvirt.driver [None req-9b33de58-191d-4b9b-921a-96c0efc31240 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] Deleting instance files /var/lib/nova/instances/3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c_del
Nov 28 17:58:43 compute-0 nova_compute[187223]: 2025-11-28 17:58:43.831 187227 INFO nova.virt.libvirt.driver [None req-9b33de58-191d-4b9b-921a-96c0efc31240 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] Deletion of /var/lib/nova/instances/3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c_del complete
Nov 28 17:58:43 compute-0 nova_compute[187223]: 2025-11-28 17:58:43.881 187227 INFO nova.compute.manager [None req-9b33de58-191d-4b9b-921a-96c0efc31240 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] Took 0.33 seconds to destroy the instance on the hypervisor.
Nov 28 17:58:43 compute-0 nova_compute[187223]: 2025-11-28 17:58:43.881 187227 DEBUG oslo.service.loopingcall [None req-9b33de58-191d-4b9b-921a-96c0efc31240 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 28 17:58:43 compute-0 nova_compute[187223]: 2025-11-28 17:58:43.882 187227 DEBUG nova.compute.manager [-] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 28 17:58:43 compute-0 nova_compute[187223]: 2025-11-28 17:58:43.882 187227 DEBUG nova.network.neutron [-] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 28 17:58:44 compute-0 nova_compute[187223]: 2025-11-28 17:58:44.474 187227 DEBUG nova.compute.manager [req-1f247942-19bf-40ad-9df1-0e5f2c5bcbf3 req-af4ec110-62a1-4785-be63-6c0019582a10 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] Received event network-vif-unplugged-3868291d-fb0c-4bb9-94d3-897a190444b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:58:44 compute-0 nova_compute[187223]: 2025-11-28 17:58:44.474 187227 DEBUG oslo_concurrency.lockutils [req-1f247942-19bf-40ad-9df1-0e5f2c5bcbf3 req-af4ec110-62a1-4785-be63-6c0019582a10 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:58:44 compute-0 nova_compute[187223]: 2025-11-28 17:58:44.475 187227 DEBUG oslo_concurrency.lockutils [req-1f247942-19bf-40ad-9df1-0e5f2c5bcbf3 req-af4ec110-62a1-4785-be63-6c0019582a10 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:58:44 compute-0 nova_compute[187223]: 2025-11-28 17:58:44.475 187227 DEBUG oslo_concurrency.lockutils [req-1f247942-19bf-40ad-9df1-0e5f2c5bcbf3 req-af4ec110-62a1-4785-be63-6c0019582a10 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:58:44 compute-0 nova_compute[187223]: 2025-11-28 17:58:44.475 187227 DEBUG nova.compute.manager [req-1f247942-19bf-40ad-9df1-0e5f2c5bcbf3 req-af4ec110-62a1-4785-be63-6c0019582a10 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] No waiting events found dispatching network-vif-unplugged-3868291d-fb0c-4bb9-94d3-897a190444b7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:58:44 compute-0 nova_compute[187223]: 2025-11-28 17:58:44.475 187227 DEBUG nova.compute.manager [req-1f247942-19bf-40ad-9df1-0e5f2c5bcbf3 req-af4ec110-62a1-4785-be63-6c0019582a10 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] Received event network-vif-unplugged-3868291d-fb0c-4bb9-94d3-897a190444b7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 28 17:58:44 compute-0 nova_compute[187223]: 2025-11-28 17:58:44.889 187227 DEBUG nova.network.neutron [-] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:58:44 compute-0 nova_compute[187223]: 2025-11-28 17:58:44.906 187227 INFO nova.compute.manager [-] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] Took 1.02 seconds to deallocate network for instance.
Nov 28 17:58:44 compute-0 nova_compute[187223]: 2025-11-28 17:58:44.975 187227 DEBUG oslo_concurrency.lockutils [None req-9b33de58-191d-4b9b-921a-96c0efc31240 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:58:44 compute-0 nova_compute[187223]: 2025-11-28 17:58:44.976 187227 DEBUG oslo_concurrency.lockutils [None req-9b33de58-191d-4b9b-921a-96c0efc31240 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:58:45 compute-0 nova_compute[187223]: 2025-11-28 17:58:45.103 187227 DEBUG nova.compute.provider_tree [None req-9b33de58-191d-4b9b-921a-96c0efc31240 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 17:58:45 compute-0 nova_compute[187223]: 2025-11-28 17:58:45.142 187227 DEBUG nova.scheduler.client.report [None req-9b33de58-191d-4b9b-921a-96c0efc31240 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 17:58:45 compute-0 nova_compute[187223]: 2025-11-28 17:58:45.170 187227 DEBUG oslo_concurrency.lockutils [None req-9b33de58-191d-4b9b-921a-96c0efc31240 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.195s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:58:45 compute-0 nova_compute[187223]: 2025-11-28 17:58:45.202 187227 INFO nova.scheduler.client.report [None req-9b33de58-191d-4b9b-921a-96c0efc31240 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Deleted allocations for instance 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c
Nov 28 17:58:45 compute-0 nova_compute[187223]: 2025-11-28 17:58:45.277 187227 DEBUG oslo_concurrency.lockutils [None req-9b33de58-191d-4b9b-921a-96c0efc31240 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Lock "3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.733s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:58:45 compute-0 nova_compute[187223]: 2025-11-28 17:58:45.429 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:58:45 compute-0 nova_compute[187223]: 2025-11-28 17:58:45.980 187227 DEBUG oslo_concurrency.lockutils [None req-27867e98-2021-4275-9561-b1301ce95c49 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Acquiring lock "5d30319e-5bea-48ef-ae70-71e215054fc4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:58:45 compute-0 nova_compute[187223]: 2025-11-28 17:58:45.980 187227 DEBUG oslo_concurrency.lockutils [None req-27867e98-2021-4275-9561-b1301ce95c49 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Lock "5d30319e-5bea-48ef-ae70-71e215054fc4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:58:45 compute-0 nova_compute[187223]: 2025-11-28 17:58:45.980 187227 DEBUG oslo_concurrency.lockutils [None req-27867e98-2021-4275-9561-b1301ce95c49 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Acquiring lock "5d30319e-5bea-48ef-ae70-71e215054fc4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:58:45 compute-0 nova_compute[187223]: 2025-11-28 17:58:45.981 187227 DEBUG oslo_concurrency.lockutils [None req-27867e98-2021-4275-9561-b1301ce95c49 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Lock "5d30319e-5bea-48ef-ae70-71e215054fc4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:58:45 compute-0 nova_compute[187223]: 2025-11-28 17:58:45.981 187227 DEBUG oslo_concurrency.lockutils [None req-27867e98-2021-4275-9561-b1301ce95c49 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Lock "5d30319e-5bea-48ef-ae70-71e215054fc4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:58:45 compute-0 nova_compute[187223]: 2025-11-28 17:58:45.982 187227 INFO nova.compute.manager [None req-27867e98-2021-4275-9561-b1301ce95c49 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] [instance: 5d30319e-5bea-48ef-ae70-71e215054fc4] Terminating instance
Nov 28 17:58:45 compute-0 nova_compute[187223]: 2025-11-28 17:58:45.982 187227 DEBUG nova.compute.manager [None req-27867e98-2021-4275-9561-b1301ce95c49 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] [instance: 5d30319e-5bea-48ef-ae70-71e215054fc4] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 28 17:58:46 compute-0 kernel: tap02b7708c-53 (unregistering): left promiscuous mode
Nov 28 17:58:46 compute-0 NetworkManager[55763]: <info>  [1764352726.0065] device (tap02b7708c-53): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 28 17:58:46 compute-0 nova_compute[187223]: 2025-11-28 17:58:46.048 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:58:46 compute-0 ovn_controller[95574]: 2025-11-28T17:58:46Z|00204|binding|INFO|Releasing lport 02b7708c-53c9-4a33-81e8-3c777c77986b from this chassis (sb_readonly=0)
Nov 28 17:58:46 compute-0 ovn_controller[95574]: 2025-11-28T17:58:46Z|00205|binding|INFO|Setting lport 02b7708c-53c9-4a33-81e8-3c777c77986b down in Southbound
Nov 28 17:58:46 compute-0 ovn_controller[95574]: 2025-11-28T17:58:46Z|00206|binding|INFO|Removing iface tap02b7708c-53 ovn-installed in OVS
Nov 28 17:58:46 compute-0 nova_compute[187223]: 2025-11-28 17:58:46.051 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:58:46 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:58:46.059 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:90:f4:f8 10.100.0.5'], port_security=['fa:16:3e:90:f4:f8 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '5d30319e-5bea-48ef-ae70-71e215054fc4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2ff1e6ce-b114-4644-bacd-9ad17a33b1a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd29c5331f43848f995592402975b88d1', 'neutron:revision_number': '13', 'neutron:security_group_ids': '4a4ddb51-0ed6-4517-9c9f-37fff06e30e0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c9230971-b926-4b1d-9b28-cbd943422cad, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], logical_port=02b7708c-53c9-4a33-81e8-3c777c77986b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 17:58:46 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:58:46.060 104433 INFO neutron.agent.ovn.metadata.agent [-] Port 02b7708c-53c9-4a33-81e8-3c777c77986b in datapath 2ff1e6ce-b114-4644-bacd-9ad17a33b1a4 unbound from our chassis
Nov 28 17:58:46 compute-0 nova_compute[187223]: 2025-11-28 17:58:46.062 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:58:46 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:58:46.062 104433 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2ff1e6ce-b114-4644-bacd-9ad17a33b1a4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 17:58:46 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:58:46.063 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[b62ef941-ff49-4539-8348-16a54c563ebf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:58:46 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:58:46.063 104433 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-2ff1e6ce-b114-4644-bacd-9ad17a33b1a4 namespace which is not needed anymore
Nov 28 17:58:46 compute-0 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000019.scope: Deactivated successfully.
Nov 28 17:58:46 compute-0 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000019.scope: Consumed 1.426s CPU time.
Nov 28 17:58:46 compute-0 systemd-machined[153517]: Machine qemu-19-instance-00000019 terminated.
Nov 28 17:58:46 compute-0 neutron-haproxy-ovnmeta-2ff1e6ce-b114-4644-bacd-9ad17a33b1a4[218236]: [NOTICE]   (218240) : haproxy version is 2.8.14-c23fe91
Nov 28 17:58:46 compute-0 neutron-haproxy-ovnmeta-2ff1e6ce-b114-4644-bacd-9ad17a33b1a4[218236]: [NOTICE]   (218240) : path to executable is /usr/sbin/haproxy
Nov 28 17:58:46 compute-0 neutron-haproxy-ovnmeta-2ff1e6ce-b114-4644-bacd-9ad17a33b1a4[218236]: [WARNING]  (218240) : Exiting Master process...
Nov 28 17:58:46 compute-0 neutron-haproxy-ovnmeta-2ff1e6ce-b114-4644-bacd-9ad17a33b1a4[218236]: [ALERT]    (218240) : Current worker (218242) exited with code 143 (Terminated)
Nov 28 17:58:46 compute-0 neutron-haproxy-ovnmeta-2ff1e6ce-b114-4644-bacd-9ad17a33b1a4[218236]: [WARNING]  (218240) : All workers exited. Exiting... (0)
Nov 28 17:58:46 compute-0 systemd[1]: libpod-67b6a17631f62459f84b114ff539dfbaea7a193710a3dbe3755f77418d7004ee.scope: Deactivated successfully.
Nov 28 17:58:46 compute-0 podman[218721]: 2025-11-28 17:58:46.199885858 +0000 UTC m=+0.043119536 container died 67b6a17631f62459f84b114ff539dfbaea7a193710a3dbe3755f77418d7004ee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2ff1e6ce-b114-4644-bacd-9ad17a33b1a4, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 28 17:58:46 compute-0 NetworkManager[55763]: <info>  [1764352726.2003] manager: (tap02b7708c-53): new Tun device (/org/freedesktop/NetworkManager/Devices/84)
Nov 28 17:58:46 compute-0 podman[218721]: 2025-11-28 17:58:46.230568644 +0000 UTC m=+0.073802322 container cleanup 67b6a17631f62459f84b114ff539dfbaea7a193710a3dbe3755f77418d7004ee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2ff1e6ce-b114-4644-bacd-9ad17a33b1a4, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125)
Nov 28 17:58:46 compute-0 systemd[1]: var-lib-containers-storage-overlay-90deb0c21e5f447735665ca476ac873dc9b8bd20126f7324ff20efba85ff7647-merged.mount: Deactivated successfully.
Nov 28 17:58:46 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-67b6a17631f62459f84b114ff539dfbaea7a193710a3dbe3755f77418d7004ee-userdata-shm.mount: Deactivated successfully.
Nov 28 17:58:46 compute-0 nova_compute[187223]: 2025-11-28 17:58:46.243 187227 INFO nova.virt.libvirt.driver [-] [instance: 5d30319e-5bea-48ef-ae70-71e215054fc4] Instance destroyed successfully.
Nov 28 17:58:46 compute-0 nova_compute[187223]: 2025-11-28 17:58:46.244 187227 DEBUG nova.objects.instance [None req-27867e98-2021-4275-9561-b1301ce95c49 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Lazy-loading 'resources' on Instance uuid 5d30319e-5bea-48ef-ae70-71e215054fc4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:58:46 compute-0 systemd[1]: libpod-conmon-67b6a17631f62459f84b114ff539dfbaea7a193710a3dbe3755f77418d7004ee.scope: Deactivated successfully.
Nov 28 17:58:46 compute-0 nova_compute[187223]: 2025-11-28 17:58:46.258 187227 DEBUG nova.virt.libvirt.vif [None req-27867e98-2021-4275-9561-b1301ce95c49 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-28T17:56:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-354005683',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-354005683',id=25,image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-28T17:56:59Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d29c5331f43848f995592402975b88d1',ramdisk_id='',reservation_id='r-zhi6qxko',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_mi
n_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-480471442',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-480471442-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-28T17:58:39Z,user_data=None,user_id='0be3599a4e1a4dc2bece41c2543e4bc9',uuid=5d30319e-5bea-48ef-ae70-71e215054fc4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "02b7708c-53c9-4a33-81e8-3c777c77986b", "address": "fa:16:3e:90:f4:f8", "network": {"id": "2ff1e6ce-b114-4644-bacd-9ad17a33b1a4", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-706434666-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d29c5331f43848f995592402975b88d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02b7708c-53", "ovs_interfaceid": "02b7708c-53c9-4a33-81e8-3c777c77986b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 28 17:58:46 compute-0 nova_compute[187223]: 2025-11-28 17:58:46.259 187227 DEBUG nova.network.os_vif_util [None req-27867e98-2021-4275-9561-b1301ce95c49 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Converting VIF {"id": "02b7708c-53c9-4a33-81e8-3c777c77986b", "address": "fa:16:3e:90:f4:f8", "network": {"id": "2ff1e6ce-b114-4644-bacd-9ad17a33b1a4", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-706434666-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d29c5331f43848f995592402975b88d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02b7708c-53", "ovs_interfaceid": "02b7708c-53c9-4a33-81e8-3c777c77986b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 17:58:46 compute-0 nova_compute[187223]: 2025-11-28 17:58:46.259 187227 DEBUG nova.network.os_vif_util [None req-27867e98-2021-4275-9561-b1301ce95c49 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:90:f4:f8,bridge_name='br-int',has_traffic_filtering=True,id=02b7708c-53c9-4a33-81e8-3c777c77986b,network=Network(2ff1e6ce-b114-4644-bacd-9ad17a33b1a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02b7708c-53') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 17:58:46 compute-0 nova_compute[187223]: 2025-11-28 17:58:46.260 187227 DEBUG os_vif [None req-27867e98-2021-4275-9561-b1301ce95c49 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:90:f4:f8,bridge_name='br-int',has_traffic_filtering=True,id=02b7708c-53c9-4a33-81e8-3c777c77986b,network=Network(2ff1e6ce-b114-4644-bacd-9ad17a33b1a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02b7708c-53') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 28 17:58:46 compute-0 nova_compute[187223]: 2025-11-28 17:58:46.261 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:58:46 compute-0 nova_compute[187223]: 2025-11-28 17:58:46.261 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap02b7708c-53, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:58:46 compute-0 nova_compute[187223]: 2025-11-28 17:58:46.263 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:58:46 compute-0 nova_compute[187223]: 2025-11-28 17:58:46.265 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 17:58:46 compute-0 nova_compute[187223]: 2025-11-28 17:58:46.267 187227 DEBUG nova.compute.manager [req-2607160f-1ba0-4c42-8e83-f81303238990 req-6299f420-ec2e-4b46-a87e-426728167936 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 5d30319e-5bea-48ef-ae70-71e215054fc4] Received event network-vif-unplugged-02b7708c-53c9-4a33-81e8-3c777c77986b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:58:46 compute-0 nova_compute[187223]: 2025-11-28 17:58:46.268 187227 DEBUG oslo_concurrency.lockutils [req-2607160f-1ba0-4c42-8e83-f81303238990 req-6299f420-ec2e-4b46-a87e-426728167936 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "5d30319e-5bea-48ef-ae70-71e215054fc4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:58:46 compute-0 nova_compute[187223]: 2025-11-28 17:58:46.268 187227 DEBUG oslo_concurrency.lockutils [req-2607160f-1ba0-4c42-8e83-f81303238990 req-6299f420-ec2e-4b46-a87e-426728167936 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "5d30319e-5bea-48ef-ae70-71e215054fc4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:58:46 compute-0 nova_compute[187223]: 2025-11-28 17:58:46.268 187227 DEBUG oslo_concurrency.lockutils [req-2607160f-1ba0-4c42-8e83-f81303238990 req-6299f420-ec2e-4b46-a87e-426728167936 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "5d30319e-5bea-48ef-ae70-71e215054fc4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:58:46 compute-0 nova_compute[187223]: 2025-11-28 17:58:46.269 187227 DEBUG nova.compute.manager [req-2607160f-1ba0-4c42-8e83-f81303238990 req-6299f420-ec2e-4b46-a87e-426728167936 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 5d30319e-5bea-48ef-ae70-71e215054fc4] No waiting events found dispatching network-vif-unplugged-02b7708c-53c9-4a33-81e8-3c777c77986b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:58:46 compute-0 nova_compute[187223]: 2025-11-28 17:58:46.269 187227 DEBUG nova.compute.manager [req-2607160f-1ba0-4c42-8e83-f81303238990 req-6299f420-ec2e-4b46-a87e-426728167936 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 5d30319e-5bea-48ef-ae70-71e215054fc4] Received event network-vif-unplugged-02b7708c-53c9-4a33-81e8-3c777c77986b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 28 17:58:46 compute-0 nova_compute[187223]: 2025-11-28 17:58:46.270 187227 INFO os_vif [None req-27867e98-2021-4275-9561-b1301ce95c49 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:90:f4:f8,bridge_name='br-int',has_traffic_filtering=True,id=02b7708c-53c9-4a33-81e8-3c777c77986b,network=Network(2ff1e6ce-b114-4644-bacd-9ad17a33b1a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02b7708c-53')
Nov 28 17:58:46 compute-0 nova_compute[187223]: 2025-11-28 17:58:46.271 187227 INFO nova.virt.libvirt.driver [None req-27867e98-2021-4275-9561-b1301ce95c49 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] [instance: 5d30319e-5bea-48ef-ae70-71e215054fc4] Deleting instance files /var/lib/nova/instances/5d30319e-5bea-48ef-ae70-71e215054fc4_del
Nov 28 17:58:46 compute-0 nova_compute[187223]: 2025-11-28 17:58:46.272 187227 INFO nova.virt.libvirt.driver [None req-27867e98-2021-4275-9561-b1301ce95c49 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] [instance: 5d30319e-5bea-48ef-ae70-71e215054fc4] Deletion of /var/lib/nova/instances/5d30319e-5bea-48ef-ae70-71e215054fc4_del complete
Nov 28 17:58:46 compute-0 podman[218765]: 2025-11-28 17:58:46.293900214 +0000 UTC m=+0.039707938 container remove 67b6a17631f62459f84b114ff539dfbaea7a193710a3dbe3755f77418d7004ee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2ff1e6ce-b114-4644-bacd-9ad17a33b1a4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 28 17:58:46 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:58:46.299 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[8251d6fb-3335-423e-aa0b-b00fe957d650]: (4, ('Fri Nov 28 05:58:46 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-2ff1e6ce-b114-4644-bacd-9ad17a33b1a4 (67b6a17631f62459f84b114ff539dfbaea7a193710a3dbe3755f77418d7004ee)\n67b6a17631f62459f84b114ff539dfbaea7a193710a3dbe3755f77418d7004ee\nFri Nov 28 05:58:46 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-2ff1e6ce-b114-4644-bacd-9ad17a33b1a4 (67b6a17631f62459f84b114ff539dfbaea7a193710a3dbe3755f77418d7004ee)\n67b6a17631f62459f84b114ff539dfbaea7a193710a3dbe3755f77418d7004ee\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:58:46 compute-0 podman[218758]: 2025-11-28 17:58:46.301075161 +0000 UTC m=+0.055170645 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, name=ubi9-minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_id=edpm, container_name=openstack_network_exporter)
Nov 28 17:58:46 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:58:46.301 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[88993d27-8962-4354-8f3b-66bc0498ecbc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:58:46 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:58:46.301 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2ff1e6ce-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:58:46 compute-0 nova_compute[187223]: 2025-11-28 17:58:46.303 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:58:46 compute-0 kernel: tap2ff1e6ce-b0: left promiscuous mode
Nov 28 17:58:46 compute-0 nova_compute[187223]: 2025-11-28 17:58:46.314 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:58:46 compute-0 nova_compute[187223]: 2025-11-28 17:58:46.315 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:58:46 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:58:46.316 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[993e3d3c-c8a6-46e7-bb2a-674f5cfb97c5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:58:46 compute-0 nova_compute[187223]: 2025-11-28 17:58:46.329 187227 INFO nova.compute.manager [None req-27867e98-2021-4275-9561-b1301ce95c49 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] [instance: 5d30319e-5bea-48ef-ae70-71e215054fc4] Took 0.35 seconds to destroy the instance on the hypervisor.
Nov 28 17:58:46 compute-0 nova_compute[187223]: 2025-11-28 17:58:46.330 187227 DEBUG oslo.service.loopingcall [None req-27867e98-2021-4275-9561-b1301ce95c49 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 28 17:58:46 compute-0 nova_compute[187223]: 2025-11-28 17:58:46.330 187227 DEBUG nova.compute.manager [-] [instance: 5d30319e-5bea-48ef-ae70-71e215054fc4] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 28 17:58:46 compute-0 nova_compute[187223]: 2025-11-28 17:58:46.330 187227 DEBUG nova.network.neutron [-] [instance: 5d30319e-5bea-48ef-ae70-71e215054fc4] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 28 17:58:46 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:58:46.334 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[43327a8d-80e5-4c80-94ab-b6e52ee13a93]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:58:46 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:58:46.335 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[b385198e-a233-412d-ba66-573b1661210b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:58:46 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:58:46.348 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[afd6e11b-9e47-4371-83f6-c3b915912e76]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 581028, 'reachable_time': 42701, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218802, 'error': None, 'target': 'ovnmeta-2ff1e6ce-b114-4644-bacd-9ad17a33b1a4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:58:46 compute-0 systemd[1]: run-netns-ovnmeta\x2d2ff1e6ce\x2db114\x2d4644\x2dbacd\x2d9ad17a33b1a4.mount: Deactivated successfully.
Nov 28 17:58:46 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:58:46.352 104546 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-2ff1e6ce-b114-4644-bacd-9ad17a33b1a4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 28 17:58:46 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:58:46.352 104546 DEBUG oslo.privsep.daemon [-] privsep: reply[53144f26-9cb5-4fab-a8ab-1d7d97d30caf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:58:46 compute-0 nova_compute[187223]: 2025-11-28 17:58:46.589 187227 DEBUG nova.compute.manager [req-19df8ba4-39a9-443f-9525-66c406343413 req-d8ffe327-eefc-4a1a-b955-f21a9c221b0c 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] Received event network-vif-plugged-3868291d-fb0c-4bb9-94d3-897a190444b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:58:46 compute-0 nova_compute[187223]: 2025-11-28 17:58:46.590 187227 DEBUG oslo_concurrency.lockutils [req-19df8ba4-39a9-443f-9525-66c406343413 req-d8ffe327-eefc-4a1a-b955-f21a9c221b0c 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:58:46 compute-0 nova_compute[187223]: 2025-11-28 17:58:46.591 187227 DEBUG oslo_concurrency.lockutils [req-19df8ba4-39a9-443f-9525-66c406343413 req-d8ffe327-eefc-4a1a-b955-f21a9c221b0c 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:58:46 compute-0 nova_compute[187223]: 2025-11-28 17:58:46.591 187227 DEBUG oslo_concurrency.lockutils [req-19df8ba4-39a9-443f-9525-66c406343413 req-d8ffe327-eefc-4a1a-b955-f21a9c221b0c 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:58:46 compute-0 nova_compute[187223]: 2025-11-28 17:58:46.591 187227 DEBUG nova.compute.manager [req-19df8ba4-39a9-443f-9525-66c406343413 req-d8ffe327-eefc-4a1a-b955-f21a9c221b0c 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] No waiting events found dispatching network-vif-plugged-3868291d-fb0c-4bb9-94d3-897a190444b7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:58:46 compute-0 nova_compute[187223]: 2025-11-28 17:58:46.592 187227 WARNING nova.compute.manager [req-19df8ba4-39a9-443f-9525-66c406343413 req-d8ffe327-eefc-4a1a-b955-f21a9c221b0c 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] Received unexpected event network-vif-plugged-3868291d-fb0c-4bb9-94d3-897a190444b7 for instance with vm_state deleted and task_state None.
Nov 28 17:58:46 compute-0 nova_compute[187223]: 2025-11-28 17:58:46.592 187227 DEBUG nova.compute.manager [req-19df8ba4-39a9-443f-9525-66c406343413 req-d8ffe327-eefc-4a1a-b955-f21a9c221b0c 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] Received event network-vif-deleted-3868291d-fb0c-4bb9-94d3-897a190444b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:58:47 compute-0 nova_compute[187223]: 2025-11-28 17:58:47.220 187227 DEBUG nova.network.neutron [-] [instance: 5d30319e-5bea-48ef-ae70-71e215054fc4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:58:47 compute-0 nova_compute[187223]: 2025-11-28 17:58:47.239 187227 INFO nova.compute.manager [-] [instance: 5d30319e-5bea-48ef-ae70-71e215054fc4] Took 0.91 seconds to deallocate network for instance.
Nov 28 17:58:47 compute-0 nova_compute[187223]: 2025-11-28 17:58:47.303 187227 DEBUG oslo_concurrency.lockutils [None req-27867e98-2021-4275-9561-b1301ce95c49 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:58:47 compute-0 nova_compute[187223]: 2025-11-28 17:58:47.304 187227 DEBUG oslo_concurrency.lockutils [None req-27867e98-2021-4275-9561-b1301ce95c49 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:58:47 compute-0 nova_compute[187223]: 2025-11-28 17:58:47.309 187227 DEBUG oslo_concurrency.lockutils [None req-27867e98-2021-4275-9561-b1301ce95c49 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:58:47 compute-0 nova_compute[187223]: 2025-11-28 17:58:47.366 187227 INFO nova.scheduler.client.report [None req-27867e98-2021-4275-9561-b1301ce95c49 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Deleted allocations for instance 5d30319e-5bea-48ef-ae70-71e215054fc4
Nov 28 17:58:47 compute-0 nova_compute[187223]: 2025-11-28 17:58:47.429 187227 DEBUG oslo_concurrency.lockutils [None req-27867e98-2021-4275-9561-b1301ce95c49 0be3599a4e1a4dc2bece41c2543e4bc9 d29c5331f43848f995592402975b88d1 - - default default] Lock "5d30319e-5bea-48ef-ae70-71e215054fc4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.448s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:58:47 compute-0 nova_compute[187223]: 2025-11-28 17:58:47.701 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:58:47 compute-0 nova_compute[187223]: 2025-11-28 17:58:47.716 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:58:47 compute-0 nova_compute[187223]: 2025-11-28 17:58:47.716 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:58:47 compute-0 nova_compute[187223]: 2025-11-28 17:58:47.716 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 17:58:48 compute-0 nova_compute[187223]: 2025-11-28 17:58:48.412 187227 DEBUG nova.compute.manager [req-c88507c8-1e4a-439e-b8d7-5d9efe64fc19 req-c17a0abc-58eb-489e-b131-de5b60633e0c 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 5d30319e-5bea-48ef-ae70-71e215054fc4] Received event network-vif-plugged-02b7708c-53c9-4a33-81e8-3c777c77986b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:58:48 compute-0 nova_compute[187223]: 2025-11-28 17:58:48.413 187227 DEBUG oslo_concurrency.lockutils [req-c88507c8-1e4a-439e-b8d7-5d9efe64fc19 req-c17a0abc-58eb-489e-b131-de5b60633e0c 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "5d30319e-5bea-48ef-ae70-71e215054fc4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:58:48 compute-0 nova_compute[187223]: 2025-11-28 17:58:48.415 187227 DEBUG oslo_concurrency.lockutils [req-c88507c8-1e4a-439e-b8d7-5d9efe64fc19 req-c17a0abc-58eb-489e-b131-de5b60633e0c 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "5d30319e-5bea-48ef-ae70-71e215054fc4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:58:48 compute-0 nova_compute[187223]: 2025-11-28 17:58:48.415 187227 DEBUG oslo_concurrency.lockutils [req-c88507c8-1e4a-439e-b8d7-5d9efe64fc19 req-c17a0abc-58eb-489e-b131-de5b60633e0c 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "5d30319e-5bea-48ef-ae70-71e215054fc4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:58:48 compute-0 nova_compute[187223]: 2025-11-28 17:58:48.416 187227 DEBUG nova.compute.manager [req-c88507c8-1e4a-439e-b8d7-5d9efe64fc19 req-c17a0abc-58eb-489e-b131-de5b60633e0c 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 5d30319e-5bea-48ef-ae70-71e215054fc4] No waiting events found dispatching network-vif-plugged-02b7708c-53c9-4a33-81e8-3c777c77986b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 17:58:48 compute-0 nova_compute[187223]: 2025-11-28 17:58:48.416 187227 WARNING nova.compute.manager [req-c88507c8-1e4a-439e-b8d7-5d9efe64fc19 req-c17a0abc-58eb-489e-b131-de5b60633e0c 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 5d30319e-5bea-48ef-ae70-71e215054fc4] Received unexpected event network-vif-plugged-02b7708c-53c9-4a33-81e8-3c777c77986b for instance with vm_state deleted and task_state None.
Nov 28 17:58:48 compute-0 nova_compute[187223]: 2025-11-28 17:58:48.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:58:48 compute-0 nova_compute[187223]: 2025-11-28 17:58:48.737 187227 DEBUG nova.compute.manager [req-9f9536cf-ccad-44ee-8a27-70297a3684e1 req-12fa31aa-254c-4523-8c27-e272897e9dd0 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: 5d30319e-5bea-48ef-ae70-71e215054fc4] Received event network-vif-deleted-02b7708c-53c9-4a33-81e8-3c777c77986b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:58:49 compute-0 nova_compute[187223]: 2025-11-28 17:58:49.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:58:50 compute-0 nova_compute[187223]: 2025-11-28 17:58:50.430 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:58:51 compute-0 nova_compute[187223]: 2025-11-28 17:58:51.264 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:58:54 compute-0 nova_compute[187223]: 2025-11-28 17:58:54.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:58:54 compute-0 nova_compute[187223]: 2025-11-28 17:58:54.684 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 17:58:54 compute-0 nova_compute[187223]: 2025-11-28 17:58:54.684 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 17:58:54 compute-0 nova_compute[187223]: 2025-11-28 17:58:54.736 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 17:58:55 compute-0 nova_compute[187223]: 2025-11-28 17:58:55.432 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:58:56 compute-0 nova_compute[187223]: 2025-11-28 17:58:56.266 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:58:56 compute-0 nova_compute[187223]: 2025-11-28 17:58:56.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:58:56 compute-0 nova_compute[187223]: 2025-11-28 17:58:56.710 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:58:56 compute-0 nova_compute[187223]: 2025-11-28 17:58:56.711 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:58:56 compute-0 nova_compute[187223]: 2025-11-28 17:58:56.711 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:58:56 compute-0 nova_compute[187223]: 2025-11-28 17:58:56.711 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 17:58:56 compute-0 nova_compute[187223]: 2025-11-28 17:58:56.870 187227 WARNING nova.virt.libvirt.driver [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 17:58:56 compute-0 nova_compute[187223]: 2025-11-28 17:58:56.872 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5842MB free_disk=73.34138488769531GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 17:58:56 compute-0 nova_compute[187223]: 2025-11-28 17:58:56.872 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:58:56 compute-0 nova_compute[187223]: 2025-11-28 17:58:56.873 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:58:56 compute-0 nova_compute[187223]: 2025-11-28 17:58:56.942 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 17:58:56 compute-0 nova_compute[187223]: 2025-11-28 17:58:56.942 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 17:58:57 compute-0 nova_compute[187223]: 2025-11-28 17:58:57.038 187227 DEBUG nova.compute.provider_tree [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 17:58:57 compute-0 nova_compute[187223]: 2025-11-28 17:58:57.060 187227 DEBUG nova.scheduler.client.report [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 17:58:57 compute-0 nova_compute[187223]: 2025-11-28 17:58:57.094 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 17:58:57 compute-0 nova_compute[187223]: 2025-11-28 17:58:57.094 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.222s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:58:58 compute-0 nova_compute[187223]: 2025-11-28 17:58:58.095 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:58:58 compute-0 podman[218804]: 2025-11-28 17:58:58.236700912 +0000 UTC m=+0.087227890 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 17:58:58 compute-0 nova_compute[187223]: 2025-11-28 17:58:58.679 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:58:58 compute-0 nova_compute[187223]: 2025-11-28 17:58:58.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:58:58 compute-0 nova_compute[187223]: 2025-11-28 17:58:58.808 187227 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764352723.8079689, 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:58:58 compute-0 nova_compute[187223]: 2025-11-28 17:58:58.809 187227 INFO nova.compute.manager [-] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] VM Stopped (Lifecycle Event)
Nov 28 17:58:58 compute-0 nova_compute[187223]: 2025-11-28 17:58:58.828 187227 DEBUG nova.compute.manager [None req-4870443a-fa06-4257-a0a3-6f9c23c50866 - - - - - -] [instance: 3a32bfb8-38e2-4cde-aa4f-0a8f6c48071c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:58:59 compute-0 podman[197556]: time="2025-11-28T17:58:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:58:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:58:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 28 17:58:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:58:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2598 "" "Go-http-client/1.1"
Nov 28 17:59:00 compute-0 nova_compute[187223]: 2025-11-28 17:59:00.483 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:59:01 compute-0 nova_compute[187223]: 2025-11-28 17:59:01.249 187227 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764352726.239623, 5d30319e-5bea-48ef-ae70-71e215054fc4 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:59:01 compute-0 nova_compute[187223]: 2025-11-28 17:59:01.249 187227 INFO nova.compute.manager [-] [instance: 5d30319e-5bea-48ef-ae70-71e215054fc4] VM Stopped (Lifecycle Event)
Nov 28 17:59:01 compute-0 nova_compute[187223]: 2025-11-28 17:59:01.270 187227 DEBUG nova.compute.manager [None req-e3a37cc1-7281-4b86-8d8b-469bc8869b71 - - - - - -] [instance: 5d30319e-5bea-48ef-ae70-71e215054fc4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:59:01 compute-0 nova_compute[187223]: 2025-11-28 17:59:01.270 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:59:01 compute-0 openstack_network_exporter[199717]: ERROR   17:59:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:59:01 compute-0 openstack_network_exporter[199717]: ERROR   17:59:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:59:01 compute-0 openstack_network_exporter[199717]: ERROR   17:59:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:59:01 compute-0 openstack_network_exporter[199717]: ERROR   17:59:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:59:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:59:01 compute-0 openstack_network_exporter[199717]: ERROR   17:59:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:59:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:59:05 compute-0 nova_compute[187223]: 2025-11-28 17:59:05.484 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:59:06 compute-0 nova_compute[187223]: 2025-11-28 17:59:06.279 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:59:07 compute-0 podman[218829]: 2025-11-28 17:59:07.255745182 +0000 UTC m=+0.117082493 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 17:59:09 compute-0 nova_compute[187223]: 2025-11-28 17:59:09.072 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:59:10 compute-0 nova_compute[187223]: 2025-11-28 17:59:10.488 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:59:11 compute-0 nova_compute[187223]: 2025-11-28 17:59:11.331 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:59:14 compute-0 podman[218846]: 2025-11-28 17:59:14.237916484 +0000 UTC m=+0.086877330 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 17:59:14 compute-0 podman[218847]: 2025-11-28 17:59:14.270536696 +0000 UTC m=+0.111580364 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 17:59:15 compute-0 nova_compute[187223]: 2025-11-28 17:59:15.491 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:59:16 compute-0 nova_compute[187223]: 2025-11-28 17:59:16.334 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:59:17 compute-0 podman[218890]: 2025-11-28 17:59:17.209805715 +0000 UTC m=+0.073110372 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, vcs-type=git, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, com.redhat.component=ubi9-minimal-container, 
container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 28 17:59:20 compute-0 nova_compute[187223]: 2025-11-28 17:59:20.493 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:59:21 compute-0 nova_compute[187223]: 2025-11-28 17:59:21.336 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:59:25 compute-0 nova_compute[187223]: 2025-11-28 17:59:25.497 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:59:26 compute-0 nova_compute[187223]: 2025-11-28 17:59:26.338 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:59:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:59:27.711 104433 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:59:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:59:27.712 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:59:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:59:27.712 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:59:29 compute-0 podman[218911]: 2025-11-28 17:59:29.202013001 +0000 UTC m=+0.059290483 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 17:59:29 compute-0 podman[197556]: time="2025-11-28T17:59:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:59:29 compute-0 podman[197556]: @ - - [28/Nov/2025:17:59:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 28 17:59:29 compute-0 podman[197556]: @ - - [28/Nov/2025:17:59:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2602 "" "Go-http-client/1.1"
Nov 28 17:59:30 compute-0 nova_compute[187223]: 2025-11-28 17:59:30.500 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:59:31 compute-0 nova_compute[187223]: 2025-11-28 17:59:31.340 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:59:31 compute-0 openstack_network_exporter[199717]: ERROR   17:59:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 17:59:31 compute-0 openstack_network_exporter[199717]: ERROR   17:59:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:59:31 compute-0 openstack_network_exporter[199717]: ERROR   17:59:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 17:59:31 compute-0 openstack_network_exporter[199717]: ERROR   17:59:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 17:59:31 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:59:31 compute-0 openstack_network_exporter[199717]: ERROR   17:59:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 17:59:31 compute-0 openstack_network_exporter[199717]: 
Nov 28 17:59:31 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:59:31.543 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=28, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3e:3c:af', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '0e:e0:60:f9:5f:25'}, ipsec=False) old=SB_Global(nb_cfg=27) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 17:59:31 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:59:31.544 104433 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 17:59:31 compute-0 nova_compute[187223]: 2025-11-28 17:59:31.546 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:59:34 compute-0 sshd-session[218935]: Invalid user sol from 193.32.162.146 port 43924
Nov 28 17:59:34 compute-0 sshd-session[218935]: Connection closed by invalid user sol 193.32.162.146 port 43924 [preauth]
Nov 28 17:59:35 compute-0 nova_compute[187223]: 2025-11-28 17:59:35.501 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:59:36 compute-0 nova_compute[187223]: 2025-11-28 17:59:36.386 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:59:38 compute-0 podman[218937]: 2025-11-28 17:59:38.188494089 +0000 UTC m=+0.054202876 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, 
org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 28 17:59:40 compute-0 nova_compute[187223]: 2025-11-28 17:59:40.504 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:59:41 compute-0 nova_compute[187223]: 2025-11-28 17:59:41.388 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:59:41 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:59:41.547 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2ad2dbac-a967-40fb-b69b-7c374c5f8e9d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '28'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:59:43 compute-0 ovn_controller[95574]: 2025-11-28T17:59:43Z|00207|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Nov 28 17:59:45 compute-0 podman[218958]: 2025-11-28 17:59:45.210947035 +0000 UTC m=+0.063944669 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 17:59:45 compute-0 podman[218959]: 2025-11-28 17:59:45.235620457 +0000 UTC m=+0.087942701 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 28 17:59:45 compute-0 nova_compute[187223]: 2025-11-28 17:59:45.544 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:59:46 compute-0 nova_compute[187223]: 2025-11-28 17:59:46.389 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:59:47 compute-0 nova_compute[187223]: 2025-11-28 17:59:47.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:59:47 compute-0 nova_compute[187223]: 2025-11-28 17:59:47.683 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 17:59:48 compute-0 podman[219004]: 2025-11-28 17:59:48.19268151 +0000 UTC m=+0.054457424 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., config_id=edpm, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, version=9.6, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, name=ubi9-minimal, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64)
Nov 28 17:59:49 compute-0 nova_compute[187223]: 2025-11-28 17:59:49.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:59:49 compute-0 nova_compute[187223]: 2025-11-28 17:59:49.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:59:50 compute-0 nova_compute[187223]: 2025-11-28 17:59:50.546 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:59:50 compute-0 nova_compute[187223]: 2025-11-28 17:59:50.761 187227 DEBUG oslo_concurrency.lockutils [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] Acquiring lock "e1427d58-0afb-4202-a9d8-bf96366dadcb" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:59:50 compute-0 nova_compute[187223]: 2025-11-28 17:59:50.761 187227 DEBUG oslo_concurrency.lockutils [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] Lock "e1427d58-0afb-4202-a9d8-bf96366dadcb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:59:50 compute-0 nova_compute[187223]: 2025-11-28 17:59:50.781 187227 DEBUG nova.compute.manager [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 28 17:59:50 compute-0 nova_compute[187223]: 2025-11-28 17:59:50.861 187227 DEBUG oslo_concurrency.lockutils [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:59:50 compute-0 nova_compute[187223]: 2025-11-28 17:59:50.861 187227 DEBUG oslo_concurrency.lockutils [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:59:50 compute-0 nova_compute[187223]: 2025-11-28 17:59:50.867 187227 DEBUG nova.virt.hardware [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 28 17:59:50 compute-0 nova_compute[187223]: 2025-11-28 17:59:50.868 187227 INFO nova.compute.claims [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Claim successful on node compute-0.ctlplane.example.com
Nov 28 17:59:50 compute-0 nova_compute[187223]: 2025-11-28 17:59:50.975 187227 DEBUG nova.compute.provider_tree [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 17:59:51 compute-0 nova_compute[187223]: 2025-11-28 17:59:50.999 187227 DEBUG nova.scheduler.client.report [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 17:59:51 compute-0 nova_compute[187223]: 2025-11-28 17:59:51.032 187227 DEBUG oslo_concurrency.lockutils [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.171s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:59:51 compute-0 nova_compute[187223]: 2025-11-28 17:59:51.033 187227 DEBUG nova.compute.manager [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 28 17:59:51 compute-0 nova_compute[187223]: 2025-11-28 17:59:51.120 187227 DEBUG nova.compute.manager [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 28 17:59:51 compute-0 nova_compute[187223]: 2025-11-28 17:59:51.121 187227 DEBUG nova.network.neutron [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 28 17:59:51 compute-0 nova_compute[187223]: 2025-11-28 17:59:51.153 187227 INFO nova.virt.libvirt.driver [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 28 17:59:51 compute-0 nova_compute[187223]: 2025-11-28 17:59:51.169 187227 DEBUG nova.compute.manager [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 28 17:59:51 compute-0 nova_compute[187223]: 2025-11-28 17:59:51.251 187227 DEBUG nova.compute.manager [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 28 17:59:51 compute-0 nova_compute[187223]: 2025-11-28 17:59:51.253 187227 DEBUG nova.virt.libvirt.driver [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 28 17:59:51 compute-0 nova_compute[187223]: 2025-11-28 17:59:51.253 187227 INFO nova.virt.libvirt.driver [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Creating image(s)
Nov 28 17:59:51 compute-0 nova_compute[187223]: 2025-11-28 17:59:51.254 187227 DEBUG oslo_concurrency.lockutils [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] Acquiring lock "/var/lib/nova/instances/e1427d58-0afb-4202-a9d8-bf96366dadcb/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:59:51 compute-0 nova_compute[187223]: 2025-11-28 17:59:51.254 187227 DEBUG oslo_concurrency.lockutils [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] Lock "/var/lib/nova/instances/e1427d58-0afb-4202-a9d8-bf96366dadcb/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:59:51 compute-0 nova_compute[187223]: 2025-11-28 17:59:51.255 187227 DEBUG oslo_concurrency.lockutils [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] Lock "/var/lib/nova/instances/e1427d58-0afb-4202-a9d8-bf96366dadcb/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:59:51 compute-0 nova_compute[187223]: 2025-11-28 17:59:51.269 187227 DEBUG oslo_concurrency.processutils [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:59:51 compute-0 nova_compute[187223]: 2025-11-28 17:59:51.337 187227 DEBUG oslo_concurrency.processutils [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:59:51 compute-0 nova_compute[187223]: 2025-11-28 17:59:51.338 187227 DEBUG oslo_concurrency.lockutils [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] Acquiring lock "bd6565e0cc32420aac21dd3de8842bc53bb13eba" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:59:51 compute-0 nova_compute[187223]: 2025-11-28 17:59:51.339 187227 DEBUG oslo_concurrency.lockutils [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] Lock "bd6565e0cc32420aac21dd3de8842bc53bb13eba" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:59:51 compute-0 nova_compute[187223]: 2025-11-28 17:59:51.351 187227 DEBUG oslo_concurrency.processutils [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:59:51 compute-0 nova_compute[187223]: 2025-11-28 17:59:51.391 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:59:51 compute-0 nova_compute[187223]: 2025-11-28 17:59:51.409 187227 DEBUG oslo_concurrency.processutils [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:59:51 compute-0 nova_compute[187223]: 2025-11-28 17:59:51.410 187227 DEBUG oslo_concurrency.processutils [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba,backing_fmt=raw /var/lib/nova/instances/e1427d58-0afb-4202-a9d8-bf96366dadcb/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:59:51 compute-0 nova_compute[187223]: 2025-11-28 17:59:51.432 187227 DEBUG nova.policy [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2dadf919dd7945b5b71d057656849719', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1addffece1c14b2985592a03630de15d', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 28 17:59:51 compute-0 nova_compute[187223]: 2025-11-28 17:59:51.451 187227 DEBUG oslo_concurrency.processutils [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba,backing_fmt=raw /var/lib/nova/instances/e1427d58-0afb-4202-a9d8-bf96366dadcb/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:59:51 compute-0 nova_compute[187223]: 2025-11-28 17:59:51.452 187227 DEBUG oslo_concurrency.lockutils [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] Lock "bd6565e0cc32420aac21dd3de8842bc53bb13eba" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:59:51 compute-0 nova_compute[187223]: 2025-11-28 17:59:51.452 187227 DEBUG oslo_concurrency.processutils [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:59:51 compute-0 nova_compute[187223]: 2025-11-28 17:59:51.506 187227 DEBUG oslo_concurrency.processutils [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:59:51 compute-0 nova_compute[187223]: 2025-11-28 17:59:51.508 187227 DEBUG nova.virt.disk.api [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] Checking if we can resize image /var/lib/nova/instances/e1427d58-0afb-4202-a9d8-bf96366dadcb/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 28 17:59:51 compute-0 nova_compute[187223]: 2025-11-28 17:59:51.508 187227 DEBUG oslo_concurrency.processutils [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e1427d58-0afb-4202-a9d8-bf96366dadcb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:59:51 compute-0 nova_compute[187223]: 2025-11-28 17:59:51.564 187227 DEBUG oslo_concurrency.processutils [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e1427d58-0afb-4202-a9d8-bf96366dadcb/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:59:51 compute-0 nova_compute[187223]: 2025-11-28 17:59:51.566 187227 DEBUG nova.virt.disk.api [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] Cannot resize image /var/lib/nova/instances/e1427d58-0afb-4202-a9d8-bf96366dadcb/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 28 17:59:51 compute-0 nova_compute[187223]: 2025-11-28 17:59:51.566 187227 DEBUG nova.objects.instance [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] Lazy-loading 'migration_context' on Instance uuid e1427d58-0afb-4202-a9d8-bf96366dadcb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:59:51 compute-0 nova_compute[187223]: 2025-11-28 17:59:51.585 187227 DEBUG nova.virt.libvirt.driver [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 28 17:59:51 compute-0 nova_compute[187223]: 2025-11-28 17:59:51.586 187227 DEBUG nova.virt.libvirt.driver [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Ensure instance console log exists: /var/lib/nova/instances/e1427d58-0afb-4202-a9d8-bf96366dadcb/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 28 17:59:51 compute-0 nova_compute[187223]: 2025-11-28 17:59:51.586 187227 DEBUG oslo_concurrency.lockutils [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:59:51 compute-0 nova_compute[187223]: 2025-11-28 17:59:51.587 187227 DEBUG oslo_concurrency.lockutils [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:59:51 compute-0 nova_compute[187223]: 2025-11-28 17:59:51.587 187227 DEBUG oslo_concurrency.lockutils [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:59:51 compute-0 nova_compute[187223]: 2025-11-28 17:59:51.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:59:52 compute-0 nova_compute[187223]: 2025-11-28 17:59:52.038 187227 DEBUG nova.network.neutron [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Successfully created port: ae66da5a-f808-44d2-91c9-c719c4875c6b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 28 17:59:53 compute-0 sshd-session[219040]: Invalid user support from 78.128.112.74 port 40962
Nov 28 17:59:53 compute-0 sshd-session[219040]: Connection closed by invalid user support 78.128.112.74 port 40962 [preauth]
Nov 28 17:59:54 compute-0 nova_compute[187223]: 2025-11-28 17:59:54.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:59:54 compute-0 nova_compute[187223]: 2025-11-28 17:59:54.684 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 17:59:54 compute-0 nova_compute[187223]: 2025-11-28 17:59:54.684 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 17:59:54 compute-0 nova_compute[187223]: 2025-11-28 17:59:54.701 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Nov 28 17:59:54 compute-0 nova_compute[187223]: 2025-11-28 17:59:54.701 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 17:59:54 compute-0 nova_compute[187223]: 2025-11-28 17:59:54.944 187227 DEBUG nova.network.neutron [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Successfully updated port: ae66da5a-f808-44d2-91c9-c719c4875c6b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 28 17:59:54 compute-0 nova_compute[187223]: 2025-11-28 17:59:54.966 187227 DEBUG oslo_concurrency.lockutils [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] Acquiring lock "refresh_cache-e1427d58-0afb-4202-a9d8-bf96366dadcb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 17:59:54 compute-0 nova_compute[187223]: 2025-11-28 17:59:54.967 187227 DEBUG oslo_concurrency.lockutils [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] Acquired lock "refresh_cache-e1427d58-0afb-4202-a9d8-bf96366dadcb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 17:59:54 compute-0 nova_compute[187223]: 2025-11-28 17:59:54.967 187227 DEBUG nova.network.neutron [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 28 17:59:55 compute-0 nova_compute[187223]: 2025-11-28 17:59:55.093 187227 DEBUG nova.compute.manager [req-bb5eaaf9-abed-4bd9-a74e-163ce3779616 req-f63787ed-b5a0-4e2c-9a55-a5b762a62d99 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Received event network-changed-ae66da5a-f808-44d2-91c9-c719c4875c6b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:59:55 compute-0 nova_compute[187223]: 2025-11-28 17:59:55.093 187227 DEBUG nova.compute.manager [req-bb5eaaf9-abed-4bd9-a74e-163ce3779616 req-f63787ed-b5a0-4e2c-9a55-a5b762a62d99 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Refreshing instance network info cache due to event network-changed-ae66da5a-f808-44d2-91c9-c719c4875c6b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 28 17:59:55 compute-0 nova_compute[187223]: 2025-11-28 17:59:55.094 187227 DEBUG oslo_concurrency.lockutils [req-bb5eaaf9-abed-4bd9-a74e-163ce3779616 req-f63787ed-b5a0-4e2c-9a55-a5b762a62d99 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "refresh_cache-e1427d58-0afb-4202-a9d8-bf96366dadcb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 17:59:55 compute-0 nova_compute[187223]: 2025-11-28 17:59:55.219 187227 DEBUG nova.network.neutron [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 28 17:59:55 compute-0 nova_compute[187223]: 2025-11-28 17:59:55.547 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:59:56 compute-0 nova_compute[187223]: 2025-11-28 17:59:56.392 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:59:57 compute-0 nova_compute[187223]: 2025-11-28 17:59:57.248 187227 DEBUG nova.network.neutron [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Updating instance_info_cache with network_info: [{"id": "ae66da5a-f808-44d2-91c9-c719c4875c6b", "address": "fa:16:3e:0d:bc:27", "network": {"id": "4d0c0754-c656-44c3-a3fd-a5251e8d9208", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1505296826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1addffece1c14b2985592a03630de15d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae66da5a-f8", "ovs_interfaceid": "ae66da5a-f808-44d2-91c9-c719c4875c6b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 17:59:57 compute-0 nova_compute[187223]: 2025-11-28 17:59:57.273 187227 DEBUG oslo_concurrency.lockutils [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] Releasing lock "refresh_cache-e1427d58-0afb-4202-a9d8-bf96366dadcb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 17:59:57 compute-0 nova_compute[187223]: 2025-11-28 17:59:57.274 187227 DEBUG nova.compute.manager [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Instance network_info: |[{"id": "ae66da5a-f808-44d2-91c9-c719c4875c6b", "address": "fa:16:3e:0d:bc:27", "network": {"id": "4d0c0754-c656-44c3-a3fd-a5251e8d9208", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1505296826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1addffece1c14b2985592a03630de15d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae66da5a-f8", "ovs_interfaceid": "ae66da5a-f808-44d2-91c9-c719c4875c6b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 28 17:59:57 compute-0 nova_compute[187223]: 2025-11-28 17:59:57.274 187227 DEBUG oslo_concurrency.lockutils [req-bb5eaaf9-abed-4bd9-a74e-163ce3779616 req-f63787ed-b5a0-4e2c-9a55-a5b762a62d99 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquired lock "refresh_cache-e1427d58-0afb-4202-a9d8-bf96366dadcb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 17:59:57 compute-0 nova_compute[187223]: 2025-11-28 17:59:57.274 187227 DEBUG nova.network.neutron [req-bb5eaaf9-abed-4bd9-a74e-163ce3779616 req-f63787ed-b5a0-4e2c-9a55-a5b762a62d99 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Refreshing network info cache for port ae66da5a-f808-44d2-91c9-c719c4875c6b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 28 17:59:57 compute-0 nova_compute[187223]: 2025-11-28 17:59:57.277 187227 DEBUG nova.virt.libvirt.driver [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Start _get_guest_xml network_info=[{"id": "ae66da5a-f808-44d2-91c9-c719c4875c6b", "address": "fa:16:3e:0d:bc:27", "network": {"id": "4d0c0754-c656-44c3-a3fd-a5251e8d9208", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1505296826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1addffece1c14b2985592a03630de15d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae66da5a-f8", "ovs_interfaceid": "ae66da5a-f808-44d2-91c9-c719c4875c6b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T17:27:07Z,direct_url=<?>,disk_format='qcow2',id=e66bcfff-a835-4b6a-9892-490d158c356a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ca3b936304c947f98c618f4bf25dee0e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-28T17:27:08Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_format': None, 'device_name': '/dev/vda', 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e66bcfff-a835-4b6a-9892-490d158c356a'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 28 17:59:57 compute-0 nova_compute[187223]: 2025-11-28 17:59:57.281 187227 WARNING nova.virt.libvirt.driver [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 17:59:57 compute-0 nova_compute[187223]: 2025-11-28 17:59:57.287 187227 DEBUG nova.virt.libvirt.host [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 28 17:59:57 compute-0 nova_compute[187223]: 2025-11-28 17:59:57.288 187227 DEBUG nova.virt.libvirt.host [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 28 17:59:57 compute-0 nova_compute[187223]: 2025-11-28 17:59:57.295 187227 DEBUG nova.virt.libvirt.host [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 28 17:59:57 compute-0 nova_compute[187223]: 2025-11-28 17:59:57.296 187227 DEBUG nova.virt.libvirt.host [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 28 17:59:57 compute-0 nova_compute[187223]: 2025-11-28 17:59:57.297 187227 DEBUG nova.virt.libvirt.driver [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 28 17:59:57 compute-0 nova_compute[187223]: 2025-11-28 17:59:57.298 187227 DEBUG nova.virt.hardware [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-28T17:27:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6f44bded-bdbe-4623-9c87-afc5919e8381',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T17:27:07Z,direct_url=<?>,disk_format='qcow2',id=e66bcfff-a835-4b6a-9892-490d158c356a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ca3b936304c947f98c618f4bf25dee0e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-28T17:27:08Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 28 17:59:57 compute-0 nova_compute[187223]: 2025-11-28 17:59:57.298 187227 DEBUG nova.virt.hardware [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 28 17:59:57 compute-0 nova_compute[187223]: 2025-11-28 17:59:57.299 187227 DEBUG nova.virt.hardware [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 28 17:59:57 compute-0 nova_compute[187223]: 2025-11-28 17:59:57.299 187227 DEBUG nova.virt.hardware [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 28 17:59:57 compute-0 nova_compute[187223]: 2025-11-28 17:59:57.299 187227 DEBUG nova.virt.hardware [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 28 17:59:57 compute-0 nova_compute[187223]: 2025-11-28 17:59:57.300 187227 DEBUG nova.virt.hardware [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 28 17:59:57 compute-0 nova_compute[187223]: 2025-11-28 17:59:57.300 187227 DEBUG nova.virt.hardware [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 28 17:59:57 compute-0 nova_compute[187223]: 2025-11-28 17:59:57.300 187227 DEBUG nova.virt.hardware [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 28 17:59:57 compute-0 nova_compute[187223]: 2025-11-28 17:59:57.301 187227 DEBUG nova.virt.hardware [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 28 17:59:57 compute-0 nova_compute[187223]: 2025-11-28 17:59:57.301 187227 DEBUG nova.virt.hardware [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 28 17:59:57 compute-0 nova_compute[187223]: 2025-11-28 17:59:57.301 187227 DEBUG nova.virt.hardware [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 28 17:59:57 compute-0 nova_compute[187223]: 2025-11-28 17:59:57.306 187227 DEBUG nova.virt.libvirt.vif [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T17:59:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalancingStrategy-server-320238220',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancingstrategy-server-320238220',id=27,image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1addffece1c14b2985592a03630de15d',ramdisk_id='',reservation_id='r-g4hohged',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteWorkloadBalancingStrategy-1251277952',owner_user_name='tempest-TestExecuteWorkloadBalancingStrategy-1251277952-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T17:59:51Z,user_data=None,user_id='2dadf919dd7945b5b71d057656849719',uuid=e1427d58-0afb-4202-a9d8-bf96366dadcb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ae66da5a-f808-44d2-91c9-c719c4875c6b", "address": "fa:16:3e:0d:bc:27", "network": {"id": "4d0c0754-c656-44c3-a3fd-a5251e8d9208", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1505296826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1addffece1c14b2985592a03630de15d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae66da5a-f8", "ovs_interfaceid": "ae66da5a-f808-44d2-91c9-c719c4875c6b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 28 17:59:57 compute-0 nova_compute[187223]: 2025-11-28 17:59:57.307 187227 DEBUG nova.network.os_vif_util [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] Converting VIF {"id": "ae66da5a-f808-44d2-91c9-c719c4875c6b", "address": "fa:16:3e:0d:bc:27", "network": {"id": "4d0c0754-c656-44c3-a3fd-a5251e8d9208", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1505296826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1addffece1c14b2985592a03630de15d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae66da5a-f8", "ovs_interfaceid": "ae66da5a-f808-44d2-91c9-c719c4875c6b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 17:59:57 compute-0 nova_compute[187223]: 2025-11-28 17:59:57.307 187227 DEBUG nova.network.os_vif_util [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0d:bc:27,bridge_name='br-int',has_traffic_filtering=True,id=ae66da5a-f808-44d2-91c9-c719c4875c6b,network=Network(4d0c0754-c656-44c3-a3fd-a5251e8d9208),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapae66da5a-f8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 17:59:57 compute-0 nova_compute[187223]: 2025-11-28 17:59:57.308 187227 DEBUG nova.objects.instance [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] Lazy-loading 'pci_devices' on Instance uuid e1427d58-0afb-4202-a9d8-bf96366dadcb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 17:59:57 compute-0 nova_compute[187223]: 2025-11-28 17:59:57.328 187227 DEBUG nova.virt.libvirt.driver [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] End _get_guest_xml xml=<domain type="kvm">
Nov 28 17:59:57 compute-0 nova_compute[187223]:   <uuid>e1427d58-0afb-4202-a9d8-bf96366dadcb</uuid>
Nov 28 17:59:57 compute-0 nova_compute[187223]:   <name>instance-0000001b</name>
Nov 28 17:59:57 compute-0 nova_compute[187223]:   <memory>131072</memory>
Nov 28 17:59:57 compute-0 nova_compute[187223]:   <vcpu>1</vcpu>
Nov 28 17:59:57 compute-0 nova_compute[187223]:   <metadata>
Nov 28 17:59:57 compute-0 nova_compute[187223]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 28 17:59:57 compute-0 nova_compute[187223]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 28 17:59:57 compute-0 nova_compute[187223]:       <nova:name>tempest-TestExecuteWorkloadBalancingStrategy-server-320238220</nova:name>
Nov 28 17:59:57 compute-0 nova_compute[187223]:       <nova:creationTime>2025-11-28 17:59:57</nova:creationTime>
Nov 28 17:59:57 compute-0 nova_compute[187223]:       <nova:flavor name="m1.nano">
Nov 28 17:59:57 compute-0 nova_compute[187223]:         <nova:memory>128</nova:memory>
Nov 28 17:59:57 compute-0 nova_compute[187223]:         <nova:disk>1</nova:disk>
Nov 28 17:59:57 compute-0 nova_compute[187223]:         <nova:swap>0</nova:swap>
Nov 28 17:59:57 compute-0 nova_compute[187223]:         <nova:ephemeral>0</nova:ephemeral>
Nov 28 17:59:57 compute-0 nova_compute[187223]:         <nova:vcpus>1</nova:vcpus>
Nov 28 17:59:57 compute-0 nova_compute[187223]:       </nova:flavor>
Nov 28 17:59:57 compute-0 nova_compute[187223]:       <nova:owner>
Nov 28 17:59:57 compute-0 nova_compute[187223]:         <nova:user uuid="2dadf919dd7945b5b71d057656849719">tempest-TestExecuteWorkloadBalancingStrategy-1251277952-project-member</nova:user>
Nov 28 17:59:57 compute-0 nova_compute[187223]:         <nova:project uuid="1addffece1c14b2985592a03630de15d">tempest-TestExecuteWorkloadBalancingStrategy-1251277952</nova:project>
Nov 28 17:59:57 compute-0 nova_compute[187223]:       </nova:owner>
Nov 28 17:59:57 compute-0 nova_compute[187223]:       <nova:root type="image" uuid="e66bcfff-a835-4b6a-9892-490d158c356a"/>
Nov 28 17:59:57 compute-0 nova_compute[187223]:       <nova:ports>
Nov 28 17:59:57 compute-0 nova_compute[187223]:         <nova:port uuid="ae66da5a-f808-44d2-91c9-c719c4875c6b">
Nov 28 17:59:57 compute-0 nova_compute[187223]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Nov 28 17:59:57 compute-0 nova_compute[187223]:         </nova:port>
Nov 28 17:59:57 compute-0 nova_compute[187223]:       </nova:ports>
Nov 28 17:59:57 compute-0 nova_compute[187223]:     </nova:instance>
Nov 28 17:59:57 compute-0 nova_compute[187223]:   </metadata>
Nov 28 17:59:57 compute-0 nova_compute[187223]:   <sysinfo type="smbios">
Nov 28 17:59:57 compute-0 nova_compute[187223]:     <system>
Nov 28 17:59:57 compute-0 nova_compute[187223]:       <entry name="manufacturer">RDO</entry>
Nov 28 17:59:57 compute-0 nova_compute[187223]:       <entry name="product">OpenStack Compute</entry>
Nov 28 17:59:57 compute-0 nova_compute[187223]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 28 17:59:57 compute-0 nova_compute[187223]:       <entry name="serial">e1427d58-0afb-4202-a9d8-bf96366dadcb</entry>
Nov 28 17:59:57 compute-0 nova_compute[187223]:       <entry name="uuid">e1427d58-0afb-4202-a9d8-bf96366dadcb</entry>
Nov 28 17:59:57 compute-0 nova_compute[187223]:       <entry name="family">Virtual Machine</entry>
Nov 28 17:59:57 compute-0 nova_compute[187223]:     </system>
Nov 28 17:59:57 compute-0 nova_compute[187223]:   </sysinfo>
Nov 28 17:59:57 compute-0 nova_compute[187223]:   <os>
Nov 28 17:59:57 compute-0 nova_compute[187223]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 28 17:59:57 compute-0 nova_compute[187223]:     <boot dev="hd"/>
Nov 28 17:59:57 compute-0 nova_compute[187223]:     <smbios mode="sysinfo"/>
Nov 28 17:59:57 compute-0 nova_compute[187223]:   </os>
Nov 28 17:59:57 compute-0 nova_compute[187223]:   <features>
Nov 28 17:59:57 compute-0 nova_compute[187223]:     <acpi/>
Nov 28 17:59:57 compute-0 nova_compute[187223]:     <apic/>
Nov 28 17:59:57 compute-0 nova_compute[187223]:     <vmcoreinfo/>
Nov 28 17:59:57 compute-0 nova_compute[187223]:   </features>
Nov 28 17:59:57 compute-0 nova_compute[187223]:   <clock offset="utc">
Nov 28 17:59:57 compute-0 nova_compute[187223]:     <timer name="pit" tickpolicy="delay"/>
Nov 28 17:59:57 compute-0 nova_compute[187223]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 28 17:59:57 compute-0 nova_compute[187223]:     <timer name="hpet" present="no"/>
Nov 28 17:59:57 compute-0 nova_compute[187223]:   </clock>
Nov 28 17:59:57 compute-0 nova_compute[187223]:   <cpu mode="custom" match="exact">
Nov 28 17:59:57 compute-0 nova_compute[187223]:     <model>Nehalem</model>
Nov 28 17:59:57 compute-0 nova_compute[187223]:     <topology sockets="1" cores="1" threads="1"/>
Nov 28 17:59:57 compute-0 nova_compute[187223]:   </cpu>
Nov 28 17:59:57 compute-0 nova_compute[187223]:   <devices>
Nov 28 17:59:57 compute-0 nova_compute[187223]:     <disk type="file" device="disk">
Nov 28 17:59:57 compute-0 nova_compute[187223]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 28 17:59:57 compute-0 nova_compute[187223]:       <source file="/var/lib/nova/instances/e1427d58-0afb-4202-a9d8-bf96366dadcb/disk"/>
Nov 28 17:59:57 compute-0 nova_compute[187223]:       <target dev="vda" bus="virtio"/>
Nov 28 17:59:57 compute-0 nova_compute[187223]:     </disk>
Nov 28 17:59:57 compute-0 nova_compute[187223]:     <disk type="file" device="cdrom">
Nov 28 17:59:57 compute-0 nova_compute[187223]:       <driver name="qemu" type="raw" cache="none"/>
Nov 28 17:59:57 compute-0 nova_compute[187223]:       <source file="/var/lib/nova/instances/e1427d58-0afb-4202-a9d8-bf96366dadcb/disk.config"/>
Nov 28 17:59:57 compute-0 nova_compute[187223]:       <target dev="sda" bus="sata"/>
Nov 28 17:59:57 compute-0 nova_compute[187223]:     </disk>
Nov 28 17:59:57 compute-0 nova_compute[187223]:     <interface type="ethernet">
Nov 28 17:59:57 compute-0 nova_compute[187223]:       <mac address="fa:16:3e:0d:bc:27"/>
Nov 28 17:59:57 compute-0 nova_compute[187223]:       <model type="virtio"/>
Nov 28 17:59:57 compute-0 nova_compute[187223]:       <driver name="vhost" rx_queue_size="512"/>
Nov 28 17:59:57 compute-0 nova_compute[187223]:       <mtu size="1442"/>
Nov 28 17:59:57 compute-0 nova_compute[187223]:       <target dev="tapae66da5a-f8"/>
Nov 28 17:59:57 compute-0 nova_compute[187223]:     </interface>
Nov 28 17:59:57 compute-0 nova_compute[187223]:     <serial type="pty">
Nov 28 17:59:57 compute-0 nova_compute[187223]:       <log file="/var/lib/nova/instances/e1427d58-0afb-4202-a9d8-bf96366dadcb/console.log" append="off"/>
Nov 28 17:59:57 compute-0 nova_compute[187223]:     </serial>
Nov 28 17:59:57 compute-0 nova_compute[187223]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 28 17:59:57 compute-0 nova_compute[187223]:     <video>
Nov 28 17:59:57 compute-0 nova_compute[187223]:       <model type="virtio"/>
Nov 28 17:59:57 compute-0 nova_compute[187223]:     </video>
Nov 28 17:59:57 compute-0 nova_compute[187223]:     <input type="tablet" bus="usb"/>
Nov 28 17:59:57 compute-0 nova_compute[187223]:     <rng model="virtio">
Nov 28 17:59:57 compute-0 nova_compute[187223]:       <backend model="random">/dev/urandom</backend>
Nov 28 17:59:57 compute-0 nova_compute[187223]:     </rng>
Nov 28 17:59:57 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root"/>
Nov 28 17:59:57 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:59:57 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:59:57 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:59:57 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:59:57 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:59:57 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:59:57 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:59:57 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:59:57 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:59:57 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:59:57 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:59:57 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:59:57 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:59:57 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:59:57 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:59:57 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:59:57 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:59:57 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:59:57 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:59:57 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:59:57 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:59:57 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:59:57 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:59:57 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 17:59:57 compute-0 nova_compute[187223]:     <controller type="usb" index="0"/>
Nov 28 17:59:57 compute-0 nova_compute[187223]:     <memballoon model="virtio">
Nov 28 17:59:57 compute-0 nova_compute[187223]:       <stats period="10"/>
Nov 28 17:59:57 compute-0 nova_compute[187223]:     </memballoon>
Nov 28 17:59:57 compute-0 nova_compute[187223]:   </devices>
Nov 28 17:59:57 compute-0 nova_compute[187223]: </domain>
Nov 28 17:59:57 compute-0 nova_compute[187223]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 28 17:59:57 compute-0 nova_compute[187223]: 2025-11-28 17:59:57.329 187227 DEBUG nova.compute.manager [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Preparing to wait for external event network-vif-plugged-ae66da5a-f808-44d2-91c9-c719c4875c6b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 28 17:59:57 compute-0 nova_compute[187223]: 2025-11-28 17:59:57.330 187227 DEBUG oslo_concurrency.lockutils [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] Acquiring lock "e1427d58-0afb-4202-a9d8-bf96366dadcb-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:59:57 compute-0 nova_compute[187223]: 2025-11-28 17:59:57.330 187227 DEBUG oslo_concurrency.lockutils [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] Lock "e1427d58-0afb-4202-a9d8-bf96366dadcb-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:59:57 compute-0 nova_compute[187223]: 2025-11-28 17:59:57.330 187227 DEBUG oslo_concurrency.lockutils [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] Lock "e1427d58-0afb-4202-a9d8-bf96366dadcb-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:59:57 compute-0 nova_compute[187223]: 2025-11-28 17:59:57.331 187227 DEBUG nova.virt.libvirt.vif [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T17:59:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalancingStrategy-server-320238220',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancingstrategy-server-320238220',id=27,image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1addffece1c14b2985592a03630de15d',ramdisk_id='',reservation_id='r-g4hohged',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteWorkloadBalancingStrategy-1251277952',owner_user_name='tempest-TestExecuteWorkloadBalancingStrategy-1251277952-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T17:59:51Z,user_data=None,user_id='2dadf919dd7945b5b71d057656849719',uuid=e1427d58-0afb-4202-a9d8-bf96366dadcb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ae66da5a-f808-44d2-91c9-c719c4875c6b", "address": "fa:16:3e:0d:bc:27", "network": {"id": "4d0c0754-c656-44c3-a3fd-a5251e8d9208", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1505296826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1addffece1c14b2985592a03630de15d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae66da5a-f8", "ovs_interfaceid": "ae66da5a-f808-44d2-91c9-c719c4875c6b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 28 17:59:57 compute-0 nova_compute[187223]: 2025-11-28 17:59:57.331 187227 DEBUG nova.network.os_vif_util [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] Converting VIF {"id": "ae66da5a-f808-44d2-91c9-c719c4875c6b", "address": "fa:16:3e:0d:bc:27", "network": {"id": "4d0c0754-c656-44c3-a3fd-a5251e8d9208", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1505296826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1addffece1c14b2985592a03630de15d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae66da5a-f8", "ovs_interfaceid": "ae66da5a-f808-44d2-91c9-c719c4875c6b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 17:59:57 compute-0 nova_compute[187223]: 2025-11-28 17:59:57.332 187227 DEBUG nova.network.os_vif_util [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0d:bc:27,bridge_name='br-int',has_traffic_filtering=True,id=ae66da5a-f808-44d2-91c9-c719c4875c6b,network=Network(4d0c0754-c656-44c3-a3fd-a5251e8d9208),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapae66da5a-f8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 17:59:57 compute-0 nova_compute[187223]: 2025-11-28 17:59:57.332 187227 DEBUG os_vif [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0d:bc:27,bridge_name='br-int',has_traffic_filtering=True,id=ae66da5a-f808-44d2-91c9-c719c4875c6b,network=Network(4d0c0754-c656-44c3-a3fd-a5251e8d9208),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapae66da5a-f8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 28 17:59:57 compute-0 nova_compute[187223]: 2025-11-28 17:59:57.332 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:59:57 compute-0 nova_compute[187223]: 2025-11-28 17:59:57.333 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:59:57 compute-0 nova_compute[187223]: 2025-11-28 17:59:57.333 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 17:59:57 compute-0 nova_compute[187223]: 2025-11-28 17:59:57.336 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:59:57 compute-0 nova_compute[187223]: 2025-11-28 17:59:57.337 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapae66da5a-f8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:59:57 compute-0 nova_compute[187223]: 2025-11-28 17:59:57.337 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapae66da5a-f8, col_values=(('external_ids', {'iface-id': 'ae66da5a-f808-44d2-91c9-c719c4875c6b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0d:bc:27', 'vm-uuid': 'e1427d58-0afb-4202-a9d8-bf96366dadcb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:59:57 compute-0 nova_compute[187223]: 2025-11-28 17:59:57.339 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:59:57 compute-0 NetworkManager[55763]: <info>  [1764352797.3406] manager: (tapae66da5a-f8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/85)
Nov 28 17:59:57 compute-0 nova_compute[187223]: 2025-11-28 17:59:57.341 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 17:59:57 compute-0 nova_compute[187223]: 2025-11-28 17:59:57.348 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:59:57 compute-0 nova_compute[187223]: 2025-11-28 17:59:57.349 187227 INFO os_vif [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0d:bc:27,bridge_name='br-int',has_traffic_filtering=True,id=ae66da5a-f808-44d2-91c9-c719c4875c6b,network=Network(4d0c0754-c656-44c3-a3fd-a5251e8d9208),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapae66da5a-f8')
Nov 28 17:59:57 compute-0 nova_compute[187223]: 2025-11-28 17:59:57.407 187227 DEBUG nova.virt.libvirt.driver [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 28 17:59:57 compute-0 nova_compute[187223]: 2025-11-28 17:59:57.407 187227 DEBUG nova.virt.libvirt.driver [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 28 17:59:57 compute-0 nova_compute[187223]: 2025-11-28 17:59:57.407 187227 DEBUG nova.virt.libvirt.driver [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] No VIF found with MAC fa:16:3e:0d:bc:27, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 28 17:59:57 compute-0 nova_compute[187223]: 2025-11-28 17:59:57.408 187227 INFO nova.virt.libvirt.driver [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Using config drive
Nov 28 17:59:58 compute-0 nova_compute[187223]: 2025-11-28 17:59:58.449 187227 INFO nova.virt.libvirt.driver [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Creating config drive at /var/lib/nova/instances/e1427d58-0afb-4202-a9d8-bf96366dadcb/disk.config
Nov 28 17:59:58 compute-0 nova_compute[187223]: 2025-11-28 17:59:58.459 187227 DEBUG oslo_concurrency.processutils [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e1427d58-0afb-4202-a9d8-bf96366dadcb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwiclrmp2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:59:58 compute-0 nova_compute[187223]: 2025-11-28 17:59:58.600 187227 DEBUG oslo_concurrency.processutils [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e1427d58-0afb-4202-a9d8-bf96366dadcb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwiclrmp2" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:59:58 compute-0 kernel: tapae66da5a-f8: entered promiscuous mode
Nov 28 17:59:58 compute-0 NetworkManager[55763]: <info>  [1764352798.6657] manager: (tapae66da5a-f8): new Tun device (/org/freedesktop/NetworkManager/Devices/86)
Nov 28 17:59:58 compute-0 ovn_controller[95574]: 2025-11-28T17:59:58Z|00208|binding|INFO|Claiming lport ae66da5a-f808-44d2-91c9-c719c4875c6b for this chassis.
Nov 28 17:59:58 compute-0 ovn_controller[95574]: 2025-11-28T17:59:58Z|00209|binding|INFO|ae66da5a-f808-44d2-91c9-c719c4875c6b: Claiming fa:16:3e:0d:bc:27 10.100.0.6
Nov 28 17:59:58 compute-0 nova_compute[187223]: 2025-11-28 17:59:58.667 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:59:58 compute-0 nova_compute[187223]: 2025-11-28 17:59:58.671 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:59:58 compute-0 nova_compute[187223]: 2025-11-28 17:59:58.674 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:59:58 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:59:58.683 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0d:bc:27 10.100.0.6'], port_security=['fa:16:3e:0d:bc:27 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'e1427d58-0afb-4202-a9d8-bf96366dadcb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4d0c0754-c656-44c3-a3fd-a5251e8d9208', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1addffece1c14b2985592a03630de15d', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ab4eb6a5-3b24-4c4a-b394-e5815369da3a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ac92fc25-d2cf-40c4-b3c4-2d64e6fa9b00, chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], logical_port=ae66da5a-f808-44d2-91c9-c719c4875c6b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 17:59:58 compute-0 nova_compute[187223]: 2025-11-28 17:59:58.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:59:58 compute-0 nova_compute[187223]: 2025-11-28 17:59:58.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 17:59:58 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:59:58.684 104433 INFO neutron.agent.ovn.metadata.agent [-] Port ae66da5a-f808-44d2-91c9-c719c4875c6b in datapath 4d0c0754-c656-44c3-a3fd-a5251e8d9208 bound to our chassis
Nov 28 17:59:58 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:59:58.686 104433 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4d0c0754-c656-44c3-a3fd-a5251e8d9208
Nov 28 17:59:58 compute-0 systemd-udevd[219062]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 17:59:58 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:59:58.704 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[323ca90b-2987-4837-9e7d-4f35cb84bcb0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:59:58 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:59:58.705 104433 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap4d0c0754-c1 in ovnmeta-4d0c0754-c656-44c3-a3fd-a5251e8d9208 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 28 17:59:58 compute-0 nova_compute[187223]: 2025-11-28 17:59:58.707 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:59:58 compute-0 nova_compute[187223]: 2025-11-28 17:59:58.707 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:59:58 compute-0 nova_compute[187223]: 2025-11-28 17:59:58.708 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:59:58 compute-0 nova_compute[187223]: 2025-11-28 17:59:58.708 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 17:59:58 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:59:58.707 208826 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap4d0c0754-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 28 17:59:58 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:59:58.707 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[cd9c096b-abd3-444d-982a-38e1f5381ab6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:59:58 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:59:58.708 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[d4c90b32-f3d8-4338-9b4b-7323b2725ff7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:59:58 compute-0 NetworkManager[55763]: <info>  [1764352798.7105] device (tapae66da5a-f8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 28 17:59:58 compute-0 NetworkManager[55763]: <info>  [1764352798.7113] device (tapae66da5a-f8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 28 17:59:58 compute-0 systemd-machined[153517]: New machine qemu-20-instance-0000001b.
Nov 28 17:59:58 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:59:58.724 104546 DEBUG oslo.privsep.daemon [-] privsep: reply[4ea99506-522f-4a7d-b2ab-9dafac2d0086]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:59:58 compute-0 nova_compute[187223]: 2025-11-28 17:59:58.725 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:59:58 compute-0 ovn_controller[95574]: 2025-11-28T17:59:58Z|00210|binding|INFO|Setting lport ae66da5a-f808-44d2-91c9-c719c4875c6b ovn-installed in OVS
Nov 28 17:59:58 compute-0 ovn_controller[95574]: 2025-11-28T17:59:58Z|00211|binding|INFO|Setting lport ae66da5a-f808-44d2-91c9-c719c4875c6b up in Southbound
Nov 28 17:59:58 compute-0 nova_compute[187223]: 2025-11-28 17:59:58.730 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:59:58 compute-0 systemd[1]: Started Virtual Machine qemu-20-instance-0000001b.
Nov 28 17:59:58 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:59:58.741 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[9cdcd3ca-3b24-4638-b409-108bb28fd835]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:59:58 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:59:58.775 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[7875333b-4ac5-40e1-abb8-2b042fa4ea5c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:59:58 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:59:58.780 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[b634f3ab-a014-4d1a-97f8-96a4eaef87d1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:59:58 compute-0 NetworkManager[55763]: <info>  [1764352798.7810] manager: (tap4d0c0754-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/87)
Nov 28 17:59:58 compute-0 systemd-udevd[219065]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 17:59:58 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:59:58.816 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[2cf68ad3-9a25-49c5-bbed-de72c7e744dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:59:58 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:59:58.819 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[11e867b4-1262-4d60-ad4e-ea007cae52f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:59:58 compute-0 NetworkManager[55763]: <info>  [1764352798.8417] device (tap4d0c0754-c0): carrier: link connected
Nov 28 17:59:58 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:59:58.848 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[d06e5602-a5f1-4000-832b-c91cea04bebf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:59:58 compute-0 nova_compute[187223]: 2025-11-28 17:59:58.854 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e1427d58-0afb-4202-a9d8-bf96366dadcb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:59:58 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:59:58.864 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[bcefa429-4588-434e-b659-33a4773302af]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4d0c0754-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bc:5c:da'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 61], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 597347, 'reachable_time': 33608, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219095, 'error': None, 'target': 'ovnmeta-4d0c0754-c656-44c3-a3fd-a5251e8d9208', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:59:58 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:59:58.879 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[5bd0d1bb-bd6a-41b6-80aa-0dc00953d7b1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:febc:5cda'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 597347, 'tstamp': 597347}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219097, 'error': None, 'target': 'ovnmeta-4d0c0754-c656-44c3-a3fd-a5251e8d9208', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:59:58 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:59:58.896 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[bbad1b90-13cf-4100-9745-ede557d95cd0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4d0c0754-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bc:5c:da'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 61], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 597347, 'reachable_time': 33608, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219098, 'error': None, 'target': 'ovnmeta-4d0c0754-c656-44c3-a3fd-a5251e8d9208', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:59:58 compute-0 nova_compute[187223]: 2025-11-28 17:59:58.919 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e1427d58-0afb-4202-a9d8-bf96366dadcb/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:59:58 compute-0 nova_compute[187223]: 2025-11-28 17:59:58.921 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e1427d58-0afb-4202-a9d8-bf96366dadcb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 17:59:58 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:59:58.939 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[a68326d3-d8e3-4567-b752-8b2ba5b1221a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:59:58 compute-0 nova_compute[187223]: 2025-11-28 17:59:58.974 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e1427d58-0afb-4202-a9d8-bf96366dadcb/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 17:59:59 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:59:59.003 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[f3d30725-074d-4ee1-82a7-30044721e47e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:59:59 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:59:59.004 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4d0c0754-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:59:59 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:59:59.005 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 17:59:59 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:59:59.005 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4d0c0754-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:59:59 compute-0 NetworkManager[55763]: <info>  [1764352799.0080] manager: (tap4d0c0754-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/88)
Nov 28 17:59:59 compute-0 kernel: tap4d0c0754-c0: entered promiscuous mode
Nov 28 17:59:59 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:59:59.011 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4d0c0754-c0, col_values=(('external_ids', {'iface-id': '90bdb5ac-c485-4d15-9cae-2c0dc516b8d9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 17:59:59 compute-0 ovn_controller[95574]: 2025-11-28T17:59:59Z|00212|binding|INFO|Releasing lport 90bdb5ac-c485-4d15-9cae-2c0dc516b8d9 from this chassis (sb_readonly=0)
Nov 28 17:59:59 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:59:59.014 104433 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4d0c0754-c656-44c3-a3fd-a5251e8d9208.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4d0c0754-c656-44c3-a3fd-a5251e8d9208.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 28 17:59:59 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:59:59.015 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[60abb036-7911-4e1d-98fd-bbfe78b51fec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 17:59:59 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:59:59.018 104433 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 28 17:59:59 compute-0 ovn_metadata_agent[104428]: global
Nov 28 17:59:59 compute-0 ovn_metadata_agent[104428]:     log         /dev/log local0 debug
Nov 28 17:59:59 compute-0 ovn_metadata_agent[104428]:     log-tag     haproxy-metadata-proxy-4d0c0754-c656-44c3-a3fd-a5251e8d9208
Nov 28 17:59:59 compute-0 ovn_metadata_agent[104428]:     user        root
Nov 28 17:59:59 compute-0 ovn_metadata_agent[104428]:     group       root
Nov 28 17:59:59 compute-0 ovn_metadata_agent[104428]:     maxconn     1024
Nov 28 17:59:59 compute-0 ovn_metadata_agent[104428]:     pidfile     /var/lib/neutron/external/pids/4d0c0754-c656-44c3-a3fd-a5251e8d9208.pid.haproxy
Nov 28 17:59:59 compute-0 ovn_metadata_agent[104428]:     daemon
Nov 28 17:59:59 compute-0 ovn_metadata_agent[104428]: 
Nov 28 17:59:59 compute-0 ovn_metadata_agent[104428]: defaults
Nov 28 17:59:59 compute-0 ovn_metadata_agent[104428]:     log global
Nov 28 17:59:59 compute-0 ovn_metadata_agent[104428]:     mode http
Nov 28 17:59:59 compute-0 ovn_metadata_agent[104428]:     option httplog
Nov 28 17:59:59 compute-0 ovn_metadata_agent[104428]:     option dontlognull
Nov 28 17:59:59 compute-0 ovn_metadata_agent[104428]:     option http-server-close
Nov 28 17:59:59 compute-0 ovn_metadata_agent[104428]:     option forwardfor
Nov 28 17:59:59 compute-0 ovn_metadata_agent[104428]:     retries                 3
Nov 28 17:59:59 compute-0 ovn_metadata_agent[104428]:     timeout http-request    30s
Nov 28 17:59:59 compute-0 ovn_metadata_agent[104428]:     timeout connect         30s
Nov 28 17:59:59 compute-0 ovn_metadata_agent[104428]:     timeout client          32s
Nov 28 17:59:59 compute-0 ovn_metadata_agent[104428]:     timeout server          32s
Nov 28 17:59:59 compute-0 ovn_metadata_agent[104428]:     timeout http-keep-alive 30s
Nov 28 17:59:59 compute-0 ovn_metadata_agent[104428]: 
Nov 28 17:59:59 compute-0 ovn_metadata_agent[104428]: 
Nov 28 17:59:59 compute-0 ovn_metadata_agent[104428]: listen listener
Nov 28 17:59:59 compute-0 ovn_metadata_agent[104428]:     bind 169.254.169.254:80
Nov 28 17:59:59 compute-0 ovn_metadata_agent[104428]:     server metadata /var/lib/neutron/metadata_proxy
Nov 28 17:59:59 compute-0 ovn_metadata_agent[104428]:     http-request add-header X-OVN-Network-ID 4d0c0754-c656-44c3-a3fd-a5251e8d9208
Nov 28 17:59:59 compute-0 ovn_metadata_agent[104428]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 28 17:59:59 compute-0 ovn_metadata_agent[104428]: 2025-11-28 17:59:59.019 104433 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-4d0c0754-c656-44c3-a3fd-a5251e8d9208', 'env', 'PROCESS_TAG=haproxy-4d0c0754-c656-44c3-a3fd-a5251e8d9208', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/4d0c0754-c656-44c3-a3fd-a5251e8d9208.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 28 17:59:59 compute-0 nova_compute[187223]: 2025-11-28 17:59:59.023 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 17:59:59 compute-0 nova_compute[187223]: 2025-11-28 17:59:59.168 187227 WARNING nova.virt.libvirt.driver [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 17:59:59 compute-0 nova_compute[187223]: 2025-11-28 17:59:59.169 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5839MB free_disk=73.34057998657227GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 17:59:59 compute-0 nova_compute[187223]: 2025-11-28 17:59:59.170 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:59:59 compute-0 nova_compute[187223]: 2025-11-28 17:59:59.170 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:59:59 compute-0 nova_compute[187223]: 2025-11-28 17:59:59.238 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Instance e1427d58-0afb-4202-a9d8-bf96366dadcb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 17:59:59 compute-0 nova_compute[187223]: 2025-11-28 17:59:59.239 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 17:59:59 compute-0 nova_compute[187223]: 2025-11-28 17:59:59.239 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 17:59:59 compute-0 nova_compute[187223]: 2025-11-28 17:59:59.393 187227 DEBUG nova.compute.provider_tree [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 17:59:59 compute-0 nova_compute[187223]: 2025-11-28 17:59:59.419 187227 DEBUG nova.scheduler.client.report [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 17:59:59 compute-0 nova_compute[187223]: 2025-11-28 17:59:59.472 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 17:59:59 compute-0 nova_compute[187223]: 2025-11-28 17:59:59.472 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.302s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:59:59 compute-0 podman[219135]: 2025-11-28 17:59:59.519540442 +0000 UTC m=+0.080846936 container create 58d23d797a9cea393f4d2a208e3332aebe5b455061fcfbf629b20560e0e580ec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4d0c0754-c656-44c3-a3fd-a5251e8d9208, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 17:59:59 compute-0 nova_compute[187223]: 2025-11-28 17:59:59.525 187227 DEBUG nova.compute.manager [req-7729c9fd-ae96-4bda-ab02-1745bff6b24a req-60f6c363-2e8f-423a-aeaf-0c79bcdae4b7 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Received event network-vif-plugged-ae66da5a-f808-44d2-91c9-c719c4875c6b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 17:59:59 compute-0 nova_compute[187223]: 2025-11-28 17:59:59.525 187227 DEBUG oslo_concurrency.lockutils [req-7729c9fd-ae96-4bda-ab02-1745bff6b24a req-60f6c363-2e8f-423a-aeaf-0c79bcdae4b7 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "e1427d58-0afb-4202-a9d8-bf96366dadcb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 17:59:59 compute-0 nova_compute[187223]: 2025-11-28 17:59:59.526 187227 DEBUG oslo_concurrency.lockutils [req-7729c9fd-ae96-4bda-ab02-1745bff6b24a req-60f6c363-2e8f-423a-aeaf-0c79bcdae4b7 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "e1427d58-0afb-4202-a9d8-bf96366dadcb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 17:59:59 compute-0 nova_compute[187223]: 2025-11-28 17:59:59.526 187227 DEBUG oslo_concurrency.lockutils [req-7729c9fd-ae96-4bda-ab02-1745bff6b24a req-60f6c363-2e8f-423a-aeaf-0c79bcdae4b7 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "e1427d58-0afb-4202-a9d8-bf96366dadcb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 17:59:59 compute-0 nova_compute[187223]: 2025-11-28 17:59:59.526 187227 DEBUG nova.compute.manager [req-7729c9fd-ae96-4bda-ab02-1745bff6b24a req-60f6c363-2e8f-423a-aeaf-0c79bcdae4b7 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Processing event network-vif-plugged-ae66da5a-f808-44d2-91c9-c719c4875c6b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 28 17:59:59 compute-0 systemd[1]: Started libpod-conmon-58d23d797a9cea393f4d2a208e3332aebe5b455061fcfbf629b20560e0e580ec.scope.
Nov 28 17:59:59 compute-0 nova_compute[187223]: 2025-11-28 17:59:59.556 187227 DEBUG nova.compute.manager [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 28 17:59:59 compute-0 nova_compute[187223]: 2025-11-28 17:59:59.557 187227 DEBUG nova.virt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Emitting event <LifecycleEvent: 1764352799.5570621, e1427d58-0afb-4202-a9d8-bf96366dadcb => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:59:59 compute-0 nova_compute[187223]: 2025-11-28 17:59:59.558 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] VM Started (Lifecycle Event)
Nov 28 17:59:59 compute-0 nova_compute[187223]: 2025-11-28 17:59:59.564 187227 DEBUG nova.virt.libvirt.driver [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 28 17:59:59 compute-0 nova_compute[187223]: 2025-11-28 17:59:59.567 187227 INFO nova.virt.libvirt.driver [-] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Instance spawned successfully.
Nov 28 17:59:59 compute-0 nova_compute[187223]: 2025-11-28 17:59:59.568 187227 DEBUG nova.virt.libvirt.driver [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 28 17:59:59 compute-0 podman[219135]: 2025-11-28 17:59:59.479509245 +0000 UTC m=+0.040815789 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 17:59:59 compute-0 nova_compute[187223]: 2025-11-28 17:59:59.583 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:59:59 compute-0 systemd[1]: Started libcrun container.
Nov 28 17:59:59 compute-0 nova_compute[187223]: 2025-11-28 17:59:59.590 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 28 17:59:59 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c0ed8eb919306def3ed797a4d06617a1186ee2560ba82c64337421c6c721180d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 17:59:59 compute-0 nova_compute[187223]: 2025-11-28 17:59:59.595 187227 DEBUG nova.virt.libvirt.driver [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:59:59 compute-0 nova_compute[187223]: 2025-11-28 17:59:59.595 187227 DEBUG nova.virt.libvirt.driver [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:59:59 compute-0 nova_compute[187223]: 2025-11-28 17:59:59.596 187227 DEBUG nova.virt.libvirt.driver [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:59:59 compute-0 nova_compute[187223]: 2025-11-28 17:59:59.596 187227 DEBUG nova.virt.libvirt.driver [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:59:59 compute-0 nova_compute[187223]: 2025-11-28 17:59:59.597 187227 DEBUG nova.virt.libvirt.driver [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:59:59 compute-0 nova_compute[187223]: 2025-11-28 17:59:59.597 187227 DEBUG nova.virt.libvirt.driver [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 17:59:59 compute-0 podman[219135]: 2025-11-28 17:59:59.606769451 +0000 UTC m=+0.168075965 container init 58d23d797a9cea393f4d2a208e3332aebe5b455061fcfbf629b20560e0e580ec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4d0c0754-c656-44c3-a3fd-a5251e8d9208, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 28 17:59:59 compute-0 podman[219135]: 2025-11-28 17:59:59.614133114 +0000 UTC m=+0.175439578 container start 58d23d797a9cea393f4d2a208e3332aebe5b455061fcfbf629b20560e0e580ec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4d0c0754-c656-44c3-a3fd-a5251e8d9208, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125)
Nov 28 17:59:59 compute-0 podman[219155]: 2025-11-28 17:59:59.621318461 +0000 UTC m=+0.062193227 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 17:59:59 compute-0 nova_compute[187223]: 2025-11-28 17:59:59.628 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 28 17:59:59 compute-0 nova_compute[187223]: 2025-11-28 17:59:59.628 187227 DEBUG nova.virt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Emitting event <LifecycleEvent: 1764352799.557308, e1427d58-0afb-4202-a9d8-bf96366dadcb => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:59:59 compute-0 nova_compute[187223]: 2025-11-28 17:59:59.628 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] VM Paused (Lifecycle Event)
Nov 28 17:59:59 compute-0 neutron-haproxy-ovnmeta-4d0c0754-c656-44c3-a3fd-a5251e8d9208[219158]: [NOTICE]   (219181) : New worker (219186) forked
Nov 28 17:59:59 compute-0 neutron-haproxy-ovnmeta-4d0c0754-c656-44c3-a3fd-a5251e8d9208[219158]: [NOTICE]   (219181) : Loading success.
Nov 28 17:59:59 compute-0 nova_compute[187223]: 2025-11-28 17:59:59.654 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:59:59 compute-0 nova_compute[187223]: 2025-11-28 17:59:59.659 187227 DEBUG nova.virt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Emitting event <LifecycleEvent: 1764352799.5596874, e1427d58-0afb-4202-a9d8-bf96366dadcb => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 17:59:59 compute-0 nova_compute[187223]: 2025-11-28 17:59:59.659 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] VM Resumed (Lifecycle Event)
Nov 28 17:59:59 compute-0 nova_compute[187223]: 2025-11-28 17:59:59.678 187227 INFO nova.compute.manager [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Took 8.43 seconds to spawn the instance on the hypervisor.
Nov 28 17:59:59 compute-0 nova_compute[187223]: 2025-11-28 17:59:59.679 187227 DEBUG nova.compute.manager [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:59:59 compute-0 nova_compute[187223]: 2025-11-28 17:59:59.680 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 17:59:59 compute-0 nova_compute[187223]: 2025-11-28 17:59:59.689 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 28 17:59:59 compute-0 nova_compute[187223]: 2025-11-28 17:59:59.729 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 28 17:59:59 compute-0 podman[197556]: time="2025-11-28T17:59:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 17:59:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:59:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18323 "" "Go-http-client/1.1"
Nov 28 17:59:59 compute-0 nova_compute[187223]: 2025-11-28 17:59:59.793 187227 INFO nova.compute.manager [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Took 8.95 seconds to build instance.
Nov 28 17:59:59 compute-0 podman[197556]: @ - - [28/Nov/2025:17:59:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3061 "" "Go-http-client/1.1"
Nov 28 17:59:59 compute-0 nova_compute[187223]: 2025-11-28 17:59:59.819 187227 DEBUG oslo_concurrency.lockutils [None req-409f4823-ed7f-4a98-aabc-655356e86164 2dadf919dd7945b5b71d057656849719 1addffece1c14b2985592a03630de15d - - default default] Lock "e1427d58-0afb-4202-a9d8-bf96366dadcb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.058s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 18:00:00 compute-0 nova_compute[187223]: 2025-11-28 18:00:00.467 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 18:00:00 compute-0 nova_compute[187223]: 2025-11-28 18:00:00.468 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 18:00:00 compute-0 nova_compute[187223]: 2025-11-28 18:00:00.549 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:00:00 compute-0 nova_compute[187223]: 2025-11-28 18:00:00.962 187227 DEBUG nova.network.neutron [req-bb5eaaf9-abed-4bd9-a74e-163ce3779616 req-f63787ed-b5a0-4e2c-9a55-a5b762a62d99 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Updated VIF entry in instance network info cache for port ae66da5a-f808-44d2-91c9-c719c4875c6b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 28 18:00:00 compute-0 nova_compute[187223]: 2025-11-28 18:00:00.962 187227 DEBUG nova.network.neutron [req-bb5eaaf9-abed-4bd9-a74e-163ce3779616 req-f63787ed-b5a0-4e2c-9a55-a5b762a62d99 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Updating instance_info_cache with network_info: [{"id": "ae66da5a-f808-44d2-91c9-c719c4875c6b", "address": "fa:16:3e:0d:bc:27", "network": {"id": "4d0c0754-c656-44c3-a3fd-a5251e8d9208", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1505296826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1addffece1c14b2985592a03630de15d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae66da5a-f8", "ovs_interfaceid": "ae66da5a-f808-44d2-91c9-c719c4875c6b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 18:00:01 compute-0 nova_compute[187223]: 2025-11-28 18:00:01.003 187227 DEBUG oslo_concurrency.lockutils [req-bb5eaaf9-abed-4bd9-a74e-163ce3779616 req-f63787ed-b5a0-4e2c-9a55-a5b762a62d99 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Releasing lock "refresh_cache-e1427d58-0afb-4202-a9d8-bf96366dadcb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 18:00:01 compute-0 openstack_network_exporter[199717]: ERROR   18:00:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 18:00:01 compute-0 openstack_network_exporter[199717]: ERROR   18:00:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 18:00:01 compute-0 openstack_network_exporter[199717]: ERROR   18:00:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 18:00:01 compute-0 openstack_network_exporter[199717]: ERROR   18:00:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 18:00:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 18:00:01 compute-0 openstack_network_exporter[199717]: ERROR   18:00:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 18:00:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 18:00:02 compute-0 nova_compute[187223]: 2025-11-28 18:00:02.340 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:00:04 compute-0 nova_compute[187223]: 2025-11-28 18:00:04.801 187227 DEBUG nova.compute.manager [req-77ef1c70-9bc9-4895-9876-467062705ea4 req-5d1f217a-bebe-4ca6-8569-2ee3362d5cea 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Received event network-vif-plugged-ae66da5a-f808-44d2-91c9-c719c4875c6b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 18:00:04 compute-0 nova_compute[187223]: 2025-11-28 18:00:04.802 187227 DEBUG oslo_concurrency.lockutils [req-77ef1c70-9bc9-4895-9876-467062705ea4 req-5d1f217a-bebe-4ca6-8569-2ee3362d5cea 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "e1427d58-0afb-4202-a9d8-bf96366dadcb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 18:00:04 compute-0 nova_compute[187223]: 2025-11-28 18:00:04.802 187227 DEBUG oslo_concurrency.lockutils [req-77ef1c70-9bc9-4895-9876-467062705ea4 req-5d1f217a-bebe-4ca6-8569-2ee3362d5cea 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "e1427d58-0afb-4202-a9d8-bf96366dadcb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 18:00:04 compute-0 nova_compute[187223]: 2025-11-28 18:00:04.802 187227 DEBUG oslo_concurrency.lockutils [req-77ef1c70-9bc9-4895-9876-467062705ea4 req-5d1f217a-bebe-4ca6-8569-2ee3362d5cea 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "e1427d58-0afb-4202-a9d8-bf96366dadcb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 18:00:04 compute-0 nova_compute[187223]: 2025-11-28 18:00:04.803 187227 DEBUG nova.compute.manager [req-77ef1c70-9bc9-4895-9876-467062705ea4 req-5d1f217a-bebe-4ca6-8569-2ee3362d5cea 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] No waiting events found dispatching network-vif-plugged-ae66da5a-f808-44d2-91c9-c719c4875c6b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 18:00:04 compute-0 nova_compute[187223]: 2025-11-28 18:00:04.803 187227 WARNING nova.compute.manager [req-77ef1c70-9bc9-4895-9876-467062705ea4 req-5d1f217a-bebe-4ca6-8569-2ee3362d5cea 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Received unexpected event network-vif-plugged-ae66da5a-f808-44d2-91c9-c719c4875c6b for instance with vm_state active and task_state None.
Nov 28 18:00:05 compute-0 nova_compute[187223]: 2025-11-28 18:00:05.550 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:00:07 compute-0 nova_compute[187223]: 2025-11-28 18:00:07.343 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:00:09 compute-0 podman[219195]: 2025-11-28 18:00:09.236033623 +0000 UTC m=+0.083747950 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent)
Nov 28 18:00:10 compute-0 nova_compute[187223]: 2025-11-28 18:00:10.554 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:00:12 compute-0 nova_compute[187223]: 2025-11-28 18:00:12.347 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:00:12 compute-0 ovn_controller[95574]: 2025-11-28T18:00:12Z|00029|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:0d:bc:27 10.100.0.6
Nov 28 18:00:12 compute-0 ovn_controller[95574]: 2025-11-28T18:00:12Z|00030|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:0d:bc:27 10.100.0.6
Nov 28 18:00:15 compute-0 nova_compute[187223]: 2025-11-28 18:00:15.556 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:00:16 compute-0 podman[219223]: 2025-11-28 18:00:16.221896632 +0000 UTC m=+0.082257637 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, managed_by=edpm_ansible)
Nov 28 18:00:16 compute-0 podman[219224]: 2025-11-28 18:00:16.260287801 +0000 UTC m=+0.111026578 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 18:00:17 compute-0 nova_compute[187223]: 2025-11-28 18:00:17.350 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:00:18 compute-0 sshd-session[219268]: Invalid user solana from 193.32.162.145 port 55326
Nov 28 18:00:18 compute-0 podman[219270]: 2025-11-28 18:00:18.468172577 +0000 UTC m=+0.084802490 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, distribution-scope=public, vcs-type=git, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 28 18:00:18 compute-0 sshd-session[219268]: Connection closed by invalid user solana 193.32.162.145 port 55326 [preauth]
Nov 28 18:00:20 compute-0 nova_compute[187223]: 2025-11-28 18:00:20.559 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:00:22 compute-0 nova_compute[187223]: 2025-11-28 18:00:22.354 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:00:23 compute-0 nova_compute[187223]: 2025-11-28 18:00:23.942 187227 DEBUG nova.virt.libvirt.driver [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Check if temp file /var/lib/nova/instances/tmp4ga1fkag exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Nov 28 18:00:23 compute-0 nova_compute[187223]: 2025-11-28 18:00:23.943 187227 DEBUG nova.compute.manager [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp4ga1fkag',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='e1427d58-0afb-4202-a9d8-bf96366dadcb',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Nov 28 18:00:24 compute-0 nova_compute[187223]: 2025-11-28 18:00:24.900 187227 DEBUG oslo_concurrency.processutils [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e1427d58-0afb-4202-a9d8-bf96366dadcb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 18:00:24 compute-0 nova_compute[187223]: 2025-11-28 18:00:24.964 187227 DEBUG oslo_concurrency.processutils [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e1427d58-0afb-4202-a9d8-bf96366dadcb/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 18:00:24 compute-0 nova_compute[187223]: 2025-11-28 18:00:24.966 187227 DEBUG oslo_concurrency.processutils [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e1427d58-0afb-4202-a9d8-bf96366dadcb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 18:00:25 compute-0 nova_compute[187223]: 2025-11-28 18:00:25.041 187227 DEBUG oslo_concurrency.processutils [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e1427d58-0afb-4202-a9d8-bf96366dadcb/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 18:00:25 compute-0 nova_compute[187223]: 2025-11-28 18:00:25.561 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:00:27 compute-0 nova_compute[187223]: 2025-11-28 18:00:27.358 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:00:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:00:27.713 104433 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 18:00:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:00:27.714 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 18:00:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:00:27.715 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 18:00:29 compute-0 podman[197556]: time="2025-11-28T18:00:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 18:00:29 compute-0 podman[197556]: @ - - [28/Nov/2025:18:00:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18323 "" "Go-http-client/1.1"
Nov 28 18:00:29 compute-0 podman[197556]: @ - - [28/Nov/2025:18:00:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3060 "" "Go-http-client/1.1"
Nov 28 18:00:30 compute-0 podman[219297]: 2025-11-28 18:00:30.180554996 +0000 UTC m=+0.048211608 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 18:00:30 compute-0 nova_compute[187223]: 2025-11-28 18:00:30.562 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:00:31 compute-0 openstack_network_exporter[199717]: ERROR   18:00:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 18:00:31 compute-0 openstack_network_exporter[199717]: ERROR   18:00:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 18:00:31 compute-0 openstack_network_exporter[199717]: ERROR   18:00:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 18:00:31 compute-0 openstack_network_exporter[199717]: ERROR   18:00:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 18:00:31 compute-0 openstack_network_exporter[199717]: 
Nov 28 18:00:31 compute-0 openstack_network_exporter[199717]: ERROR   18:00:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 18:00:31 compute-0 openstack_network_exporter[199717]: 
Nov 28 18:00:32 compute-0 nova_compute[187223]: 2025-11-28 18:00:32.360 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:00:32 compute-0 sshd-session[219321]: Accepted publickey for nova from 192.168.122.101 port 37106 ssh2: ECDSA SHA256:MQeyK0000lOAPc/y+l43YtWHgJLzao0oTzY3RbV6Jns
Nov 28 18:00:32 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Nov 28 18:00:32 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 28 18:00:32 compute-0 systemd-logind[788]: New session 43 of user nova.
Nov 28 18:00:32 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 28 18:00:32 compute-0 systemd[1]: Starting User Manager for UID 42436...
Nov 28 18:00:32 compute-0 systemd[219325]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 28 18:00:32 compute-0 systemd[219325]: Queued start job for default target Main User Target.
Nov 28 18:00:32 compute-0 systemd[219325]: Created slice User Application Slice.
Nov 28 18:00:32 compute-0 systemd[219325]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 28 18:00:32 compute-0 systemd[219325]: Started Daily Cleanup of User's Temporary Directories.
Nov 28 18:00:32 compute-0 systemd[219325]: Reached target Paths.
Nov 28 18:00:32 compute-0 systemd[219325]: Reached target Timers.
Nov 28 18:00:32 compute-0 systemd[219325]: Starting D-Bus User Message Bus Socket...
Nov 28 18:00:32 compute-0 systemd[219325]: Starting Create User's Volatile Files and Directories...
Nov 28 18:00:32 compute-0 systemd[219325]: Finished Create User's Volatile Files and Directories.
Nov 28 18:00:32 compute-0 systemd[219325]: Listening on D-Bus User Message Bus Socket.
Nov 28 18:00:32 compute-0 systemd[219325]: Reached target Sockets.
Nov 28 18:00:32 compute-0 systemd[219325]: Reached target Basic System.
Nov 28 18:00:32 compute-0 systemd[219325]: Reached target Main User Target.
Nov 28 18:00:32 compute-0 systemd[219325]: Startup finished in 126ms.
Nov 28 18:00:32 compute-0 systemd[1]: Started User Manager for UID 42436.
Nov 28 18:00:32 compute-0 systemd[1]: Started Session 43 of User nova.
Nov 28 18:00:32 compute-0 sshd-session[219321]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 28 18:00:33 compute-0 sshd-session[219340]: Received disconnect from 192.168.122.101 port 37106:11: disconnected by user
Nov 28 18:00:33 compute-0 sshd-session[219340]: Disconnected from user nova 192.168.122.101 port 37106
Nov 28 18:00:33 compute-0 sshd-session[219321]: pam_unix(sshd:session): session closed for user nova
Nov 28 18:00:33 compute-0 systemd[1]: session-43.scope: Deactivated successfully.
Nov 28 18:00:33 compute-0 systemd-logind[788]: Session 43 logged out. Waiting for processes to exit.
Nov 28 18:00:33 compute-0 systemd-logind[788]: Removed session 43.
Nov 28 18:00:33 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Nov 28 18:00:34 compute-0 nova_compute[187223]: 2025-11-28 18:00:34.090 187227 DEBUG nova.compute.manager [req-897f09dc-d270-44ee-9e9c-58ae56576be0 req-4bc8bdb4-f1ac-4810-9263-38fe5026e8a9 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Received event network-vif-unplugged-ae66da5a-f808-44d2-91c9-c719c4875c6b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 18:00:34 compute-0 nova_compute[187223]: 2025-11-28 18:00:34.091 187227 DEBUG oslo_concurrency.lockutils [req-897f09dc-d270-44ee-9e9c-58ae56576be0 req-4bc8bdb4-f1ac-4810-9263-38fe5026e8a9 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "e1427d58-0afb-4202-a9d8-bf96366dadcb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 18:00:34 compute-0 nova_compute[187223]: 2025-11-28 18:00:34.091 187227 DEBUG oslo_concurrency.lockutils [req-897f09dc-d270-44ee-9e9c-58ae56576be0 req-4bc8bdb4-f1ac-4810-9263-38fe5026e8a9 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "e1427d58-0afb-4202-a9d8-bf96366dadcb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 18:00:34 compute-0 nova_compute[187223]: 2025-11-28 18:00:34.091 187227 DEBUG oslo_concurrency.lockutils [req-897f09dc-d270-44ee-9e9c-58ae56576be0 req-4bc8bdb4-f1ac-4810-9263-38fe5026e8a9 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "e1427d58-0afb-4202-a9d8-bf96366dadcb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 18:00:34 compute-0 nova_compute[187223]: 2025-11-28 18:00:34.092 187227 DEBUG nova.compute.manager [req-897f09dc-d270-44ee-9e9c-58ae56576be0 req-4bc8bdb4-f1ac-4810-9263-38fe5026e8a9 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] No waiting events found dispatching network-vif-unplugged-ae66da5a-f808-44d2-91c9-c719c4875c6b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 18:00:34 compute-0 nova_compute[187223]: 2025-11-28 18:00:34.092 187227 DEBUG nova.compute.manager [req-897f09dc-d270-44ee-9e9c-58ae56576be0 req-4bc8bdb4-f1ac-4810-9263-38fe5026e8a9 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Received event network-vif-unplugged-ae66da5a-f808-44d2-91c9-c719c4875c6b for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 28 18:00:34 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:00:34.465 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=29, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3e:3c:af', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '0e:e0:60:f9:5f:25'}, ipsec=False) old=SB_Global(nb_cfg=28) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 18:00:34 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:00:34.466 104433 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 18:00:34 compute-0 nova_compute[187223]: 2025-11-28 18:00:34.466 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:00:34 compute-0 nova_compute[187223]: 2025-11-28 18:00:34.901 187227 INFO nova.compute.manager [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Took 9.86 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Nov 28 18:00:34 compute-0 nova_compute[187223]: 2025-11-28 18:00:34.902 187227 DEBUG nova.compute.manager [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 28 18:00:34 compute-0 nova_compute[187223]: 2025-11-28 18:00:34.930 187227 DEBUG nova.compute.manager [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp4ga1fkag',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='e1427d58-0afb-4202-a9d8-bf96366dadcb',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(49fd9b1d-0c1b-4834-9bec-e6d653048c6d),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Nov 28 18:00:34 compute-0 nova_compute[187223]: 2025-11-28 18:00:34.970 187227 DEBUG nova.objects.instance [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Lazy-loading 'migration_context' on Instance uuid e1427d58-0afb-4202-a9d8-bf96366dadcb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 18:00:34 compute-0 nova_compute[187223]: 2025-11-28 18:00:34.971 187227 DEBUG nova.virt.libvirt.driver [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Nov 28 18:00:34 compute-0 nova_compute[187223]: 2025-11-28 18:00:34.973 187227 DEBUG nova.virt.libvirt.driver [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Nov 28 18:00:34 compute-0 nova_compute[187223]: 2025-11-28 18:00:34.973 187227 DEBUG nova.virt.libvirt.driver [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Nov 28 18:00:34 compute-0 nova_compute[187223]: 2025-11-28 18:00:34.993 187227 DEBUG nova.virt.libvirt.vif [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T17:59:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalancingStrategy-server-320238220',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancingstrategy-server-320238220',id=27,image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-28T17:59:59Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='1addffece1c14b2985592a03630de15d',ramdisk_id='',reservation_id='r-g4hohged',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalancingStrategy-1251277952',owner_user_name='tempest-TestExecuteWorkloadBalancingStrategy-1251277952-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-28T17:59:59Z,user_data=None,user_id='2dadf919dd7945b5b71d057656849719',uuid=e1427d58-0afb-4202-a9d8-bf96366dadcb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ae66da5a-f808-44d2-91c9-c719c4875c6b", "address": "fa:16:3e:0d:bc:27", "network": {"id": "4d0c0754-c656-44c3-a3fd-a5251e8d9208", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1505296826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1addffece1c14b2985592a03630de15d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapae66da5a-f8", "ovs_interfaceid": "ae66da5a-f808-44d2-91c9-c719c4875c6b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 28 18:00:34 compute-0 nova_compute[187223]: 2025-11-28 18:00:34.993 187227 DEBUG nova.network.os_vif_util [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Converting VIF {"id": "ae66da5a-f808-44d2-91c9-c719c4875c6b", "address": "fa:16:3e:0d:bc:27", "network": {"id": "4d0c0754-c656-44c3-a3fd-a5251e8d9208", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1505296826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1addffece1c14b2985592a03630de15d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapae66da5a-f8", "ovs_interfaceid": "ae66da5a-f808-44d2-91c9-c719c4875c6b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 18:00:34 compute-0 nova_compute[187223]: 2025-11-28 18:00:34.994 187227 DEBUG nova.network.os_vif_util [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0d:bc:27,bridge_name='br-int',has_traffic_filtering=True,id=ae66da5a-f808-44d2-91c9-c719c4875c6b,network=Network(4d0c0754-c656-44c3-a3fd-a5251e8d9208),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapae66da5a-f8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 18:00:34 compute-0 nova_compute[187223]: 2025-11-28 18:00:34.994 187227 DEBUG nova.virt.libvirt.migration [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Updating guest XML with vif config: <interface type="ethernet">
Nov 28 18:00:34 compute-0 nova_compute[187223]:   <mac address="fa:16:3e:0d:bc:27"/>
Nov 28 18:00:34 compute-0 nova_compute[187223]:   <model type="virtio"/>
Nov 28 18:00:34 compute-0 nova_compute[187223]:   <driver name="vhost" rx_queue_size="512"/>
Nov 28 18:00:34 compute-0 nova_compute[187223]:   <mtu size="1442"/>
Nov 28 18:00:34 compute-0 nova_compute[187223]:   <target dev="tapae66da5a-f8"/>
Nov 28 18:00:34 compute-0 nova_compute[187223]: </interface>
Nov 28 18:00:34 compute-0 nova_compute[187223]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Nov 28 18:00:34 compute-0 nova_compute[187223]: 2025-11-28 18:00:34.995 187227 DEBUG nova.virt.libvirt.driver [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Nov 28 18:00:35 compute-0 nova_compute[187223]: 2025-11-28 18:00:35.477 187227 DEBUG nova.virt.libvirt.migration [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 28 18:00:35 compute-0 nova_compute[187223]: 2025-11-28 18:00:35.478 187227 INFO nova.virt.libvirt.migration [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Increasing downtime to 50 ms after 0 sec elapsed time
Nov 28 18:00:35 compute-0 nova_compute[187223]: 2025-11-28 18:00:35.567 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:00:35 compute-0 nova_compute[187223]: 2025-11-28 18:00:35.590 187227 INFO nova.virt.libvirt.driver [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Nov 28 18:00:36 compute-0 nova_compute[187223]: 2025-11-28 18:00:36.093 187227 DEBUG nova.virt.libvirt.migration [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 28 18:00:36 compute-0 nova_compute[187223]: 2025-11-28 18:00:36.094 187227 DEBUG nova.virt.libvirt.migration [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 28 18:00:36 compute-0 nova_compute[187223]: 2025-11-28 18:00:36.207 187227 DEBUG nova.compute.manager [req-e1e7c17e-c0c2-40ed-86c3-1e94c51c024c req-a3193595-b8c4-48a4-ba14-17d429b7b4b8 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Received event network-vif-plugged-ae66da5a-f808-44d2-91c9-c719c4875c6b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 18:00:36 compute-0 nova_compute[187223]: 2025-11-28 18:00:36.208 187227 DEBUG oslo_concurrency.lockutils [req-e1e7c17e-c0c2-40ed-86c3-1e94c51c024c req-a3193595-b8c4-48a4-ba14-17d429b7b4b8 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "e1427d58-0afb-4202-a9d8-bf96366dadcb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 18:00:36 compute-0 nova_compute[187223]: 2025-11-28 18:00:36.208 187227 DEBUG oslo_concurrency.lockutils [req-e1e7c17e-c0c2-40ed-86c3-1e94c51c024c req-a3193595-b8c4-48a4-ba14-17d429b7b4b8 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "e1427d58-0afb-4202-a9d8-bf96366dadcb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 18:00:36 compute-0 nova_compute[187223]: 2025-11-28 18:00:36.208 187227 DEBUG oslo_concurrency.lockutils [req-e1e7c17e-c0c2-40ed-86c3-1e94c51c024c req-a3193595-b8c4-48a4-ba14-17d429b7b4b8 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "e1427d58-0afb-4202-a9d8-bf96366dadcb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 18:00:36 compute-0 nova_compute[187223]: 2025-11-28 18:00:36.208 187227 DEBUG nova.compute.manager [req-e1e7c17e-c0c2-40ed-86c3-1e94c51c024c req-a3193595-b8c4-48a4-ba14-17d429b7b4b8 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] No waiting events found dispatching network-vif-plugged-ae66da5a-f808-44d2-91c9-c719c4875c6b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 18:00:36 compute-0 nova_compute[187223]: 2025-11-28 18:00:36.209 187227 WARNING nova.compute.manager [req-e1e7c17e-c0c2-40ed-86c3-1e94c51c024c req-a3193595-b8c4-48a4-ba14-17d429b7b4b8 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Received unexpected event network-vif-plugged-ae66da5a-f808-44d2-91c9-c719c4875c6b for instance with vm_state active and task_state migrating.
Nov 28 18:00:36 compute-0 nova_compute[187223]: 2025-11-28 18:00:36.209 187227 DEBUG nova.compute.manager [req-e1e7c17e-c0c2-40ed-86c3-1e94c51c024c req-a3193595-b8c4-48a4-ba14-17d429b7b4b8 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Received event network-changed-ae66da5a-f808-44d2-91c9-c719c4875c6b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 18:00:36 compute-0 nova_compute[187223]: 2025-11-28 18:00:36.209 187227 DEBUG nova.compute.manager [req-e1e7c17e-c0c2-40ed-86c3-1e94c51c024c req-a3193595-b8c4-48a4-ba14-17d429b7b4b8 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Refreshing instance network info cache due to event network-changed-ae66da5a-f808-44d2-91c9-c719c4875c6b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 28 18:00:36 compute-0 nova_compute[187223]: 2025-11-28 18:00:36.209 187227 DEBUG oslo_concurrency.lockutils [req-e1e7c17e-c0c2-40ed-86c3-1e94c51c024c req-a3193595-b8c4-48a4-ba14-17d429b7b4b8 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "refresh_cache-e1427d58-0afb-4202-a9d8-bf96366dadcb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 18:00:36 compute-0 nova_compute[187223]: 2025-11-28 18:00:36.209 187227 DEBUG oslo_concurrency.lockutils [req-e1e7c17e-c0c2-40ed-86c3-1e94c51c024c req-a3193595-b8c4-48a4-ba14-17d429b7b4b8 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquired lock "refresh_cache-e1427d58-0afb-4202-a9d8-bf96366dadcb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 18:00:36 compute-0 nova_compute[187223]: 2025-11-28 18:00:36.210 187227 DEBUG nova.network.neutron [req-e1e7c17e-c0c2-40ed-86c3-1e94c51c024c req-a3193595-b8c4-48a4-ba14-17d429b7b4b8 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Refreshing network info cache for port ae66da5a-f808-44d2-91c9-c719c4875c6b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 28 18:00:36 compute-0 nova_compute[187223]: 2025-11-28 18:00:36.597 187227 DEBUG nova.virt.libvirt.migration [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 28 18:00:36 compute-0 nova_compute[187223]: 2025-11-28 18:00:36.598 187227 DEBUG nova.virt.libvirt.migration [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 28 18:00:37 compute-0 nova_compute[187223]: 2025-11-28 18:00:37.101 187227 DEBUG nova.virt.libvirt.migration [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 28 18:00:37 compute-0 nova_compute[187223]: 2025-11-28 18:00:37.102 187227 DEBUG nova.virt.libvirt.migration [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 28 18:00:37 compute-0 nova_compute[187223]: 2025-11-28 18:00:37.385 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:00:37 compute-0 nova_compute[187223]: 2025-11-28 18:00:37.606 187227 DEBUG nova.virt.libvirt.migration [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 28 18:00:37 compute-0 nova_compute[187223]: 2025-11-28 18:00:37.606 187227 DEBUG nova.virt.libvirt.migration [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 28 18:00:38 compute-0 nova_compute[187223]: 2025-11-28 18:00:38.112 187227 DEBUG nova.virt.libvirt.migration [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Current 50 elapsed 3 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 28 18:00:38 compute-0 nova_compute[187223]: 2025-11-28 18:00:38.114 187227 DEBUG nova.virt.libvirt.migration [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 28 18:00:38 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:00:38.468 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2ad2dbac-a967-40fb-b69b-7c374c5f8e9d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '29'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 18:00:38 compute-0 nova_compute[187223]: 2025-11-28 18:00:38.475 187227 DEBUG nova.network.neutron [req-e1e7c17e-c0c2-40ed-86c3-1e94c51c024c req-a3193595-b8c4-48a4-ba14-17d429b7b4b8 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Updated VIF entry in instance network info cache for port ae66da5a-f808-44d2-91c9-c719c4875c6b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 28 18:00:38 compute-0 nova_compute[187223]: 2025-11-28 18:00:38.475 187227 DEBUG nova.network.neutron [req-e1e7c17e-c0c2-40ed-86c3-1e94c51c024c req-a3193595-b8c4-48a4-ba14-17d429b7b4b8 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Updating instance_info_cache with network_info: [{"id": "ae66da5a-f808-44d2-91c9-c719c4875c6b", "address": "fa:16:3e:0d:bc:27", "network": {"id": "4d0c0754-c656-44c3-a3fd-a5251e8d9208", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1505296826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1addffece1c14b2985592a03630de15d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae66da5a-f8", "ovs_interfaceid": "ae66da5a-f808-44d2-91c9-c719c4875c6b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 18:00:38 compute-0 nova_compute[187223]: 2025-11-28 18:00:38.497 187227 DEBUG oslo_concurrency.lockutils [req-e1e7c17e-c0c2-40ed-86c3-1e94c51c024c req-a3193595-b8c4-48a4-ba14-17d429b7b4b8 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Releasing lock "refresh_cache-e1427d58-0afb-4202-a9d8-bf96366dadcb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 18:00:38 compute-0 nova_compute[187223]: 2025-11-28 18:00:38.618 187227 DEBUG nova.virt.libvirt.migration [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Current 50 elapsed 3 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 28 18:00:38 compute-0 nova_compute[187223]: 2025-11-28 18:00:38.618 187227 DEBUG nova.virt.libvirt.migration [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 28 18:00:39 compute-0 nova_compute[187223]: 2025-11-28 18:00:39.122 187227 DEBUG nova.virt.libvirt.migration [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Current 50 elapsed 4 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 28 18:00:39 compute-0 nova_compute[187223]: 2025-11-28 18:00:39.123 187227 DEBUG nova.virt.libvirt.migration [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 28 18:00:39 compute-0 nova_compute[187223]: 2025-11-28 18:00:39.531 187227 DEBUG nova.virt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Emitting event <LifecycleEvent: 1764352839.5314445, e1427d58-0afb-4202-a9d8-bf96366dadcb => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 18:00:39 compute-0 nova_compute[187223]: 2025-11-28 18:00:39.532 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] VM Paused (Lifecycle Event)
Nov 28 18:00:39 compute-0 nova_compute[187223]: 2025-11-28 18:00:39.552 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 18:00:39 compute-0 nova_compute[187223]: 2025-11-28 18:00:39.556 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 28 18:00:39 compute-0 nova_compute[187223]: 2025-11-28 18:00:39.576 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] During sync_power_state the instance has a pending task (migrating). Skip.
Nov 28 18:00:39 compute-0 nova_compute[187223]: 2025-11-28 18:00:39.626 187227 DEBUG nova.virt.libvirt.migration [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Current 50 elapsed 4 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 28 18:00:39 compute-0 nova_compute[187223]: 2025-11-28 18:00:39.627 187227 DEBUG nova.virt.libvirt.migration [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 28 18:00:39 compute-0 ovn_controller[95574]: 2025-11-28T18:00:39Z|00213|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Nov 28 18:00:39 compute-0 kernel: tapae66da5a-f8 (unregistering): left promiscuous mode
Nov 28 18:00:39 compute-0 NetworkManager[55763]: <info>  [1764352839.7903] device (tapae66da5a-f8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 28 18:00:39 compute-0 ovn_controller[95574]: 2025-11-28T18:00:39Z|00214|binding|INFO|Releasing lport ae66da5a-f808-44d2-91c9-c719c4875c6b from this chassis (sb_readonly=0)
Nov 28 18:00:39 compute-0 ovn_controller[95574]: 2025-11-28T18:00:39Z|00215|binding|INFO|Setting lport ae66da5a-f808-44d2-91c9-c719c4875c6b down in Southbound
Nov 28 18:00:39 compute-0 ovn_controller[95574]: 2025-11-28T18:00:39Z|00216|binding|INFO|Removing iface tapae66da5a-f8 ovn-installed in OVS
Nov 28 18:00:39 compute-0 nova_compute[187223]: 2025-11-28 18:00:39.847 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:00:39 compute-0 nova_compute[187223]: 2025-11-28 18:00:39.849 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:00:39 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:00:39.857 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0d:bc:27 10.100.0.6'], port_security=['fa:16:3e:0d:bc:27 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '01f1e5e2-191c-41ea-9a37-abbc72987efb'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'e1427d58-0afb-4202-a9d8-bf96366dadcb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4d0c0754-c656-44c3-a3fd-a5251e8d9208', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1addffece1c14b2985592a03630de15d', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'ab4eb6a5-3b24-4c4a-b394-e5815369da3a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ac92fc25-d2cf-40c4-b3c4-2d64e6fa9b00, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], logical_port=ae66da5a-f808-44d2-91c9-c719c4875c6b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 18:00:39 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:00:39.858 104433 INFO neutron.agent.ovn.metadata.agent [-] Port ae66da5a-f808-44d2-91c9-c719c4875c6b in datapath 4d0c0754-c656-44c3-a3fd-a5251e8d9208 unbound from our chassis
Nov 28 18:00:39 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:00:39.860 104433 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4d0c0754-c656-44c3-a3fd-a5251e8d9208, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 18:00:39 compute-0 nova_compute[187223]: 2025-11-28 18:00:39.861 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:00:39 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:00:39.862 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[78b2341f-b596-4a9a-a665-4662cc2b48fe]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 18:00:39 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:00:39.862 104433 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-4d0c0754-c656-44c3-a3fd-a5251e8d9208 namespace which is not needed anymore
Nov 28 18:00:39 compute-0 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d0000001b.scope: Deactivated successfully.
Nov 28 18:00:39 compute-0 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d0000001b.scope: Consumed 14.472s CPU time.
Nov 28 18:00:39 compute-0 systemd-machined[153517]: Machine qemu-20-instance-0000001b terminated.
Nov 28 18:00:39 compute-0 podman[219349]: 2025-11-28 18:00:39.937803919 +0000 UTC m=+0.076181299 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 18:00:39 compute-0 neutron-haproxy-ovnmeta-4d0c0754-c656-44c3-a3fd-a5251e8d9208[219158]: [NOTICE]   (219181) : haproxy version is 2.8.14-c23fe91
Nov 28 18:00:39 compute-0 neutron-haproxy-ovnmeta-4d0c0754-c656-44c3-a3fd-a5251e8d9208[219158]: [NOTICE]   (219181) : path to executable is /usr/sbin/haproxy
Nov 28 18:00:39 compute-0 neutron-haproxy-ovnmeta-4d0c0754-c656-44c3-a3fd-a5251e8d9208[219158]: [WARNING]  (219181) : Exiting Master process...
Nov 28 18:00:39 compute-0 neutron-haproxy-ovnmeta-4d0c0754-c656-44c3-a3fd-a5251e8d9208[219158]: [WARNING]  (219181) : Exiting Master process...
Nov 28 18:00:39 compute-0 neutron-haproxy-ovnmeta-4d0c0754-c656-44c3-a3fd-a5251e8d9208[219158]: [ALERT]    (219181) : Current worker (219186) exited with code 143 (Terminated)
Nov 28 18:00:39 compute-0 neutron-haproxy-ovnmeta-4d0c0754-c656-44c3-a3fd-a5251e8d9208[219158]: [WARNING]  (219181) : All workers exited. Exiting... (0)
Nov 28 18:00:39 compute-0 systemd[1]: libpod-58d23d797a9cea393f4d2a208e3332aebe5b455061fcfbf629b20560e0e580ec.scope: Deactivated successfully.
Nov 28 18:00:39 compute-0 podman[219390]: 2025-11-28 18:00:39.989969445 +0000 UTC m=+0.042768845 container died 58d23d797a9cea393f4d2a208e3332aebe5b455061fcfbf629b20560e0e580ec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4d0c0754-c656-44c3-a3fd-a5251e8d9208, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 18:00:40 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-58d23d797a9cea393f4d2a208e3332aebe5b455061fcfbf629b20560e0e580ec-userdata-shm.mount: Deactivated successfully.
Nov 28 18:00:40 compute-0 systemd[1]: var-lib-containers-storage-overlay-c0ed8eb919306def3ed797a4d06617a1186ee2560ba82c64337421c6c721180d-merged.mount: Deactivated successfully.
Nov 28 18:00:40 compute-0 nova_compute[187223]: 2025-11-28 18:00:40.027 187227 DEBUG nova.virt.libvirt.driver [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Nov 28 18:00:40 compute-0 nova_compute[187223]: 2025-11-28 18:00:40.027 187227 DEBUG nova.virt.libvirt.driver [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Nov 28 18:00:40 compute-0 nova_compute[187223]: 2025-11-28 18:00:40.028 187227 DEBUG nova.virt.libvirt.driver [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Nov 28 18:00:40 compute-0 podman[219390]: 2025-11-28 18:00:40.028879674 +0000 UTC m=+0.081679064 container cleanup 58d23d797a9cea393f4d2a208e3332aebe5b455061fcfbf629b20560e0e580ec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4d0c0754-c656-44c3-a3fd-a5251e8d9208, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 18:00:40 compute-0 systemd[1]: libpod-conmon-58d23d797a9cea393f4d2a208e3332aebe5b455061fcfbf629b20560e0e580ec.scope: Deactivated successfully.
Nov 28 18:00:40 compute-0 podman[219437]: 2025-11-28 18:00:40.096148323 +0000 UTC m=+0.044062584 container remove 58d23d797a9cea393f4d2a208e3332aebe5b455061fcfbf629b20560e0e580ec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4d0c0754-c656-44c3-a3fd-a5251e8d9208, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 18:00:40 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:00:40.101 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[b948ce51-f37e-4a03-968c-cc7fffe5b710]: (4, ('Fri Nov 28 06:00:39 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-4d0c0754-c656-44c3-a3fd-a5251e8d9208 (58d23d797a9cea393f4d2a208e3332aebe5b455061fcfbf629b20560e0e580ec)\n58d23d797a9cea393f4d2a208e3332aebe5b455061fcfbf629b20560e0e580ec\nFri Nov 28 06:00:40 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-4d0c0754-c656-44c3-a3fd-a5251e8d9208 (58d23d797a9cea393f4d2a208e3332aebe5b455061fcfbf629b20560e0e580ec)\n58d23d797a9cea393f4d2a208e3332aebe5b455061fcfbf629b20560e0e580ec\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 18:00:40 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:00:40.104 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[9f6d50c9-7d78-459c-8fb5-d4801b6afa58]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 18:00:40 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:00:40.105 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4d0c0754-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 18:00:40 compute-0 nova_compute[187223]: 2025-11-28 18:00:40.107 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:00:40 compute-0 kernel: tap4d0c0754-c0: left promiscuous mode
Nov 28 18:00:40 compute-0 nova_compute[187223]: 2025-11-28 18:00:40.123 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:00:40 compute-0 nova_compute[187223]: 2025-11-28 18:00:40.124 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:00:40 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:00:40.126 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[1e87128a-d9aa-4edc-bc8f-aefdbd1e3a35]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 18:00:40 compute-0 nova_compute[187223]: 2025-11-28 18:00:40.129 187227 DEBUG nova.virt.libvirt.guest [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid 'e1427d58-0afb-4202-a9d8-bf96366dadcb' (instance-0000001b) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Nov 28 18:00:40 compute-0 nova_compute[187223]: 2025-11-28 18:00:40.129 187227 INFO nova.virt.libvirt.driver [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Migration operation has completed
Nov 28 18:00:40 compute-0 nova_compute[187223]: 2025-11-28 18:00:40.130 187227 INFO nova.compute.manager [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] _post_live_migration() is started..
Nov 28 18:00:40 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:00:40.148 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[32626126-c56c-4043-ab42-fbf5bba1b0eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 18:00:40 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:00:40.150 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[7b71dbea-70e4-4e3c-a327-12fd8ed31738]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 18:00:40 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:00:40.170 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[8c658fa7-9264-43c4-b650-12bb0449e65f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 597340, 'reachable_time': 22820, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219456, 'error': None, 'target': 'ovnmeta-4d0c0754-c656-44c3-a3fd-a5251e8d9208', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 18:00:40 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:00:40.175 104546 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-4d0c0754-c656-44c3-a3fd-a5251e8d9208 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 28 18:00:40 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:00:40.175 104546 DEBUG oslo.privsep.daemon [-] privsep: reply[6cd081aa-4d71-4c98-98df-6cfd660b7b71]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 18:00:40 compute-0 systemd[1]: run-netns-ovnmeta\x2d4d0c0754\x2dc656\x2d44c3\x2da3fd\x2da5251e8d9208.mount: Deactivated successfully.
Nov 28 18:00:40 compute-0 nova_compute[187223]: 2025-11-28 18:00:40.569 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:00:40 compute-0 nova_compute[187223]: 2025-11-28 18:00:40.683 187227 DEBUG nova.compute.manager [req-7b8fa8a6-23cf-4011-8aa0-3f8b7c5a2fc1 req-2cc230fe-daf7-443a-8199-7f1ec02f2425 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Received event network-vif-unplugged-ae66da5a-f808-44d2-91c9-c719c4875c6b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 18:00:40 compute-0 nova_compute[187223]: 2025-11-28 18:00:40.684 187227 DEBUG oslo_concurrency.lockutils [req-7b8fa8a6-23cf-4011-8aa0-3f8b7c5a2fc1 req-2cc230fe-daf7-443a-8199-7f1ec02f2425 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "e1427d58-0afb-4202-a9d8-bf96366dadcb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 18:00:40 compute-0 nova_compute[187223]: 2025-11-28 18:00:40.684 187227 DEBUG oslo_concurrency.lockutils [req-7b8fa8a6-23cf-4011-8aa0-3f8b7c5a2fc1 req-2cc230fe-daf7-443a-8199-7f1ec02f2425 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "e1427d58-0afb-4202-a9d8-bf96366dadcb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 18:00:40 compute-0 nova_compute[187223]: 2025-11-28 18:00:40.684 187227 DEBUG oslo_concurrency.lockutils [req-7b8fa8a6-23cf-4011-8aa0-3f8b7c5a2fc1 req-2cc230fe-daf7-443a-8199-7f1ec02f2425 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "e1427d58-0afb-4202-a9d8-bf96366dadcb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 18:00:40 compute-0 nova_compute[187223]: 2025-11-28 18:00:40.685 187227 DEBUG nova.compute.manager [req-7b8fa8a6-23cf-4011-8aa0-3f8b7c5a2fc1 req-2cc230fe-daf7-443a-8199-7f1ec02f2425 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] No waiting events found dispatching network-vif-unplugged-ae66da5a-f808-44d2-91c9-c719c4875c6b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 18:00:40 compute-0 nova_compute[187223]: 2025-11-28 18:00:40.685 187227 DEBUG nova.compute.manager [req-7b8fa8a6-23cf-4011-8aa0-3f8b7c5a2fc1 req-2cc230fe-daf7-443a-8199-7f1ec02f2425 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Received event network-vif-unplugged-ae66da5a-f808-44d2-91c9-c719c4875c6b for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 28 18:00:40 compute-0 nova_compute[187223]: 2025-11-28 18:00:40.934 187227 DEBUG nova.network.neutron [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Activated binding for port ae66da5a-f808-44d2-91c9-c719c4875c6b and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Nov 28 18:00:40 compute-0 nova_compute[187223]: 2025-11-28 18:00:40.934 187227 DEBUG nova.compute.manager [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "ae66da5a-f808-44d2-91c9-c719c4875c6b", "address": "fa:16:3e:0d:bc:27", "network": {"id": "4d0c0754-c656-44c3-a3fd-a5251e8d9208", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1505296826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1addffece1c14b2985592a03630de15d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae66da5a-f8", "ovs_interfaceid": "ae66da5a-f808-44d2-91c9-c719c4875c6b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Nov 28 18:00:40 compute-0 nova_compute[187223]: 2025-11-28 18:00:40.935 187227 DEBUG nova.virt.libvirt.vif [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T17:59:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalancingStrategy-server-320238220',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancingstrategy-server-320238220',id=27,image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-28T17:59:59Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='1addffece1c14b2985592a03630de15d',ramdisk_id='',reservation_id='r-g4hohged',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min
_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalancingStrategy-1251277952',owner_user_name='tempest-TestExecuteWorkloadBalancingStrategy-1251277952-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-28T18:00:21Z,user_data=None,user_id='2dadf919dd7945b5b71d057656849719',uuid=e1427d58-0afb-4202-a9d8-bf96366dadcb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ae66da5a-f808-44d2-91c9-c719c4875c6b", "address": "fa:16:3e:0d:bc:27", "network": {"id": "4d0c0754-c656-44c3-a3fd-a5251e8d9208", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1505296826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1addffece1c14b2985592a03630de15d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae66da5a-f8", "ovs_interfaceid": "ae66da5a-f808-44d2-91c9-c719c4875c6b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 28 18:00:40 compute-0 nova_compute[187223]: 2025-11-28 18:00:40.935 187227 DEBUG nova.network.os_vif_util [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Converting VIF {"id": "ae66da5a-f808-44d2-91c9-c719c4875c6b", "address": "fa:16:3e:0d:bc:27", "network": {"id": "4d0c0754-c656-44c3-a3fd-a5251e8d9208", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1505296826-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1addffece1c14b2985592a03630de15d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae66da5a-f8", "ovs_interfaceid": "ae66da5a-f808-44d2-91c9-c719c4875c6b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 18:00:40 compute-0 nova_compute[187223]: 2025-11-28 18:00:40.936 187227 DEBUG nova.network.os_vif_util [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0d:bc:27,bridge_name='br-int',has_traffic_filtering=True,id=ae66da5a-f808-44d2-91c9-c719c4875c6b,network=Network(4d0c0754-c656-44c3-a3fd-a5251e8d9208),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapae66da5a-f8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 18:00:40 compute-0 nova_compute[187223]: 2025-11-28 18:00:40.936 187227 DEBUG os_vif [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0d:bc:27,bridge_name='br-int',has_traffic_filtering=True,id=ae66da5a-f808-44d2-91c9-c719c4875c6b,network=Network(4d0c0754-c656-44c3-a3fd-a5251e8d9208),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapae66da5a-f8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 28 18:00:40 compute-0 nova_compute[187223]: 2025-11-28 18:00:40.938 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:00:40 compute-0 nova_compute[187223]: 2025-11-28 18:00:40.938 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapae66da5a-f8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 18:00:40 compute-0 nova_compute[187223]: 2025-11-28 18:00:40.942 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 18:00:40 compute-0 nova_compute[187223]: 2025-11-28 18:00:40.944 187227 INFO os_vif [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0d:bc:27,bridge_name='br-int',has_traffic_filtering=True,id=ae66da5a-f808-44d2-91c9-c719c4875c6b,network=Network(4d0c0754-c656-44c3-a3fd-a5251e8d9208),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapae66da5a-f8')
Nov 28 18:00:40 compute-0 nova_compute[187223]: 2025-11-28 18:00:40.944 187227 DEBUG oslo_concurrency.lockutils [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 18:00:40 compute-0 nova_compute[187223]: 2025-11-28 18:00:40.945 187227 DEBUG oslo_concurrency.lockutils [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 18:00:40 compute-0 nova_compute[187223]: 2025-11-28 18:00:40.945 187227 DEBUG oslo_concurrency.lockutils [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 18:00:40 compute-0 nova_compute[187223]: 2025-11-28 18:00:40.945 187227 DEBUG nova.compute.manager [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Nov 28 18:00:40 compute-0 nova_compute[187223]: 2025-11-28 18:00:40.945 187227 INFO nova.virt.libvirt.driver [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Deleting instance files /var/lib/nova/instances/e1427d58-0afb-4202-a9d8-bf96366dadcb_del
Nov 28 18:00:40 compute-0 nova_compute[187223]: 2025-11-28 18:00:40.946 187227 INFO nova.virt.libvirt.driver [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Deletion of /var/lib/nova/instances/e1427d58-0afb-4202-a9d8-bf96366dadcb_del complete
Nov 28 18:00:42 compute-0 nova_compute[187223]: 2025-11-28 18:00:42.780 187227 DEBUG nova.compute.manager [req-e1c615e0-7a30-4ed3-9ada-258fb42e601e req-357e9371-7470-4ad6-961a-fdfe8cf9b955 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Received event network-vif-plugged-ae66da5a-f808-44d2-91c9-c719c4875c6b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 18:00:42 compute-0 nova_compute[187223]: 2025-11-28 18:00:42.780 187227 DEBUG oslo_concurrency.lockutils [req-e1c615e0-7a30-4ed3-9ada-258fb42e601e req-357e9371-7470-4ad6-961a-fdfe8cf9b955 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "e1427d58-0afb-4202-a9d8-bf96366dadcb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 18:00:42 compute-0 nova_compute[187223]: 2025-11-28 18:00:42.781 187227 DEBUG oslo_concurrency.lockutils [req-e1c615e0-7a30-4ed3-9ada-258fb42e601e req-357e9371-7470-4ad6-961a-fdfe8cf9b955 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "e1427d58-0afb-4202-a9d8-bf96366dadcb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 18:00:42 compute-0 nova_compute[187223]: 2025-11-28 18:00:42.781 187227 DEBUG oslo_concurrency.lockutils [req-e1c615e0-7a30-4ed3-9ada-258fb42e601e req-357e9371-7470-4ad6-961a-fdfe8cf9b955 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "e1427d58-0afb-4202-a9d8-bf96366dadcb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 18:00:42 compute-0 nova_compute[187223]: 2025-11-28 18:00:42.781 187227 DEBUG nova.compute.manager [req-e1c615e0-7a30-4ed3-9ada-258fb42e601e req-357e9371-7470-4ad6-961a-fdfe8cf9b955 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] No waiting events found dispatching network-vif-plugged-ae66da5a-f808-44d2-91c9-c719c4875c6b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 18:00:42 compute-0 nova_compute[187223]: 2025-11-28 18:00:42.782 187227 WARNING nova.compute.manager [req-e1c615e0-7a30-4ed3-9ada-258fb42e601e req-357e9371-7470-4ad6-961a-fdfe8cf9b955 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Received unexpected event network-vif-plugged-ae66da5a-f808-44d2-91c9-c719c4875c6b for instance with vm_state active and task_state migrating.
Nov 28 18:00:42 compute-0 nova_compute[187223]: 2025-11-28 18:00:42.782 187227 DEBUG nova.compute.manager [req-e1c615e0-7a30-4ed3-9ada-258fb42e601e req-357e9371-7470-4ad6-961a-fdfe8cf9b955 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Received event network-vif-plugged-ae66da5a-f808-44d2-91c9-c719c4875c6b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 18:00:42 compute-0 nova_compute[187223]: 2025-11-28 18:00:42.782 187227 DEBUG oslo_concurrency.lockutils [req-e1c615e0-7a30-4ed3-9ada-258fb42e601e req-357e9371-7470-4ad6-961a-fdfe8cf9b955 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "e1427d58-0afb-4202-a9d8-bf96366dadcb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 18:00:42 compute-0 nova_compute[187223]: 2025-11-28 18:00:42.783 187227 DEBUG oslo_concurrency.lockutils [req-e1c615e0-7a30-4ed3-9ada-258fb42e601e req-357e9371-7470-4ad6-961a-fdfe8cf9b955 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "e1427d58-0afb-4202-a9d8-bf96366dadcb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 18:00:42 compute-0 nova_compute[187223]: 2025-11-28 18:00:42.783 187227 DEBUG oslo_concurrency.lockutils [req-e1c615e0-7a30-4ed3-9ada-258fb42e601e req-357e9371-7470-4ad6-961a-fdfe8cf9b955 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "e1427d58-0afb-4202-a9d8-bf96366dadcb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 18:00:42 compute-0 nova_compute[187223]: 2025-11-28 18:00:42.783 187227 DEBUG nova.compute.manager [req-e1c615e0-7a30-4ed3-9ada-258fb42e601e req-357e9371-7470-4ad6-961a-fdfe8cf9b955 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] No waiting events found dispatching network-vif-plugged-ae66da5a-f808-44d2-91c9-c719c4875c6b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 18:00:42 compute-0 nova_compute[187223]: 2025-11-28 18:00:42.784 187227 WARNING nova.compute.manager [req-e1c615e0-7a30-4ed3-9ada-258fb42e601e req-357e9371-7470-4ad6-961a-fdfe8cf9b955 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Received unexpected event network-vif-plugged-ae66da5a-f808-44d2-91c9-c719c4875c6b for instance with vm_state active and task_state migrating.
Nov 28 18:00:42 compute-0 nova_compute[187223]: 2025-11-28 18:00:42.784 187227 DEBUG nova.compute.manager [req-e1c615e0-7a30-4ed3-9ada-258fb42e601e req-357e9371-7470-4ad6-961a-fdfe8cf9b955 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Received event network-vif-plugged-ae66da5a-f808-44d2-91c9-c719c4875c6b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 18:00:42 compute-0 nova_compute[187223]: 2025-11-28 18:00:42.784 187227 DEBUG oslo_concurrency.lockutils [req-e1c615e0-7a30-4ed3-9ada-258fb42e601e req-357e9371-7470-4ad6-961a-fdfe8cf9b955 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "e1427d58-0afb-4202-a9d8-bf96366dadcb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 18:00:42 compute-0 nova_compute[187223]: 2025-11-28 18:00:42.785 187227 DEBUG oslo_concurrency.lockutils [req-e1c615e0-7a30-4ed3-9ada-258fb42e601e req-357e9371-7470-4ad6-961a-fdfe8cf9b955 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "e1427d58-0afb-4202-a9d8-bf96366dadcb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 18:00:42 compute-0 nova_compute[187223]: 2025-11-28 18:00:42.785 187227 DEBUG oslo_concurrency.lockutils [req-e1c615e0-7a30-4ed3-9ada-258fb42e601e req-357e9371-7470-4ad6-961a-fdfe8cf9b955 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "e1427d58-0afb-4202-a9d8-bf96366dadcb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 18:00:42 compute-0 nova_compute[187223]: 2025-11-28 18:00:42.785 187227 DEBUG nova.compute.manager [req-e1c615e0-7a30-4ed3-9ada-258fb42e601e req-357e9371-7470-4ad6-961a-fdfe8cf9b955 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] No waiting events found dispatching network-vif-plugged-ae66da5a-f808-44d2-91c9-c719c4875c6b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 18:00:42 compute-0 nova_compute[187223]: 2025-11-28 18:00:42.785 187227 WARNING nova.compute.manager [req-e1c615e0-7a30-4ed3-9ada-258fb42e601e req-357e9371-7470-4ad6-961a-fdfe8cf9b955 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Received unexpected event network-vif-plugged-ae66da5a-f808-44d2-91c9-c719c4875c6b for instance with vm_state active and task_state migrating.
Nov 28 18:00:43 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Nov 28 18:00:43 compute-0 systemd[219325]: Activating special unit Exit the Session...
Nov 28 18:00:43 compute-0 systemd[219325]: Stopped target Main User Target.
Nov 28 18:00:43 compute-0 systemd[219325]: Stopped target Basic System.
Nov 28 18:00:43 compute-0 systemd[219325]: Stopped target Paths.
Nov 28 18:00:43 compute-0 systemd[219325]: Stopped target Sockets.
Nov 28 18:00:43 compute-0 systemd[219325]: Stopped target Timers.
Nov 28 18:00:43 compute-0 systemd[219325]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 28 18:00:43 compute-0 systemd[219325]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 28 18:00:43 compute-0 systemd[219325]: Closed D-Bus User Message Bus Socket.
Nov 28 18:00:43 compute-0 systemd[219325]: Stopped Create User's Volatile Files and Directories.
Nov 28 18:00:43 compute-0 systemd[219325]: Removed slice User Application Slice.
Nov 28 18:00:43 compute-0 systemd[219325]: Reached target Shutdown.
Nov 28 18:00:43 compute-0 systemd[219325]: Finished Exit the Session.
Nov 28 18:00:43 compute-0 systemd[219325]: Reached target Exit the Session.
Nov 28 18:00:43 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Nov 28 18:00:43 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Nov 28 18:00:43 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 28 18:00:43 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 28 18:00:43 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 28 18:00:43 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 28 18:00:43 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Nov 28 18:00:45 compute-0 nova_compute[187223]: 2025-11-28 18:00:45.607 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:00:45 compute-0 nova_compute[187223]: 2025-11-28 18:00:45.941 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:00:47 compute-0 podman[219459]: 2025-11-28 18:00:47.211046995 +0000 UTC m=+0.070909371 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3)
Nov 28 18:00:47 compute-0 podman[219460]: 2025-11-28 18:00:47.248918452 +0000 UTC m=+0.102401517 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 18:00:47 compute-0 nova_compute[187223]: 2025-11-28 18:00:47.340 187227 DEBUG oslo_concurrency.lockutils [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Acquiring lock "e1427d58-0afb-4202-a9d8-bf96366dadcb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 18:00:47 compute-0 nova_compute[187223]: 2025-11-28 18:00:47.341 187227 DEBUG oslo_concurrency.lockutils [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Lock "e1427d58-0afb-4202-a9d8-bf96366dadcb-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 18:00:47 compute-0 nova_compute[187223]: 2025-11-28 18:00:47.341 187227 DEBUG oslo_concurrency.lockutils [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Lock "e1427d58-0afb-4202-a9d8-bf96366dadcb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 18:00:47 compute-0 nova_compute[187223]: 2025-11-28 18:00:47.372 187227 DEBUG oslo_concurrency.lockutils [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 18:00:47 compute-0 nova_compute[187223]: 2025-11-28 18:00:47.373 187227 DEBUG oslo_concurrency.lockutils [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 18:00:47 compute-0 nova_compute[187223]: 2025-11-28 18:00:47.374 187227 DEBUG oslo_concurrency.lockutils [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 18:00:47 compute-0 nova_compute[187223]: 2025-11-28 18:00:47.374 187227 DEBUG nova.compute.resource_tracker [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 18:00:47 compute-0 nova_compute[187223]: 2025-11-28 18:00:47.601 187227 WARNING nova.virt.libvirt.driver [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 18:00:47 compute-0 nova_compute[187223]: 2025-11-28 18:00:47.603 187227 DEBUG nova.compute.resource_tracker [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5861MB free_disk=73.34137725830078GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 18:00:47 compute-0 nova_compute[187223]: 2025-11-28 18:00:47.603 187227 DEBUG oslo_concurrency.lockutils [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 18:00:47 compute-0 nova_compute[187223]: 2025-11-28 18:00:47.603 187227 DEBUG oslo_concurrency.lockutils [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 18:00:47 compute-0 nova_compute[187223]: 2025-11-28 18:00:47.651 187227 DEBUG nova.compute.resource_tracker [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Migration for instance e1427d58-0afb-4202-a9d8-bf96366dadcb refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Nov 28 18:00:47 compute-0 nova_compute[187223]: 2025-11-28 18:00:47.678 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 18:00:47 compute-0 nova_compute[187223]: 2025-11-28 18:00:47.682 187227 DEBUG nova.compute.resource_tracker [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Nov 28 18:00:47 compute-0 nova_compute[187223]: 2025-11-28 18:00:47.718 187227 DEBUG nova.compute.resource_tracker [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Migration 49fd9b1d-0c1b-4834-9bec-e6d653048c6d is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Nov 28 18:00:47 compute-0 nova_compute[187223]: 2025-11-28 18:00:47.719 187227 DEBUG nova.compute.resource_tracker [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 18:00:47 compute-0 nova_compute[187223]: 2025-11-28 18:00:47.719 187227 DEBUG nova.compute.resource_tracker [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 18:00:47 compute-0 nova_compute[187223]: 2025-11-28 18:00:47.771 187227 DEBUG nova.compute.provider_tree [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 18:00:47 compute-0 nova_compute[187223]: 2025-11-28 18:00:47.788 187227 DEBUG nova.scheduler.client.report [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 18:00:47 compute-0 nova_compute[187223]: 2025-11-28 18:00:47.817 187227 DEBUG nova.compute.resource_tracker [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 18:00:47 compute-0 nova_compute[187223]: 2025-11-28 18:00:47.818 187227 DEBUG oslo_concurrency.lockutils [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.214s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 18:00:47 compute-0 nova_compute[187223]: 2025-11-28 18:00:47.822 187227 INFO nova.compute.manager [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Nov 28 18:00:47 compute-0 nova_compute[187223]: 2025-11-28 18:00:47.926 187227 INFO nova.scheduler.client.report [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] Deleted allocation for migration 49fd9b1d-0c1b-4834-9bec-e6d653048c6d
Nov 28 18:00:47 compute-0 nova_compute[187223]: 2025-11-28 18:00:47.926 187227 DEBUG nova.virt.libvirt.driver [None req-f053c2b9-bae3-4f32-a0dc-680803bb044f 250dbacf140948738c5c9daa62c93691 ca3b936304c947f98c618f4bf25dee0e - - default default] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Nov 28 18:00:48 compute-0 nova_compute[187223]: 2025-11-28 18:00:48.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 18:00:48 compute-0 nova_compute[187223]: 2025-11-28 18:00:48.683 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 18:00:49 compute-0 podman[219504]: 2025-11-28 18:00:49.217682356 +0000 UTC m=+0.068848539 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, distribution-scope=public, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.component=ubi9-minimal-container, release=1755695350, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, name=ubi9-minimal, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 28 18:00:49 compute-0 nova_compute[187223]: 2025-11-28 18:00:49.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 18:00:50 compute-0 nova_compute[187223]: 2025-11-28 18:00:50.610 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:00:50 compute-0 nova_compute[187223]: 2025-11-28 18:00:50.944 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:00:51 compute-0 nova_compute[187223]: 2025-11-28 18:00:51.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 18:00:52 compute-0 nova_compute[187223]: 2025-11-28 18:00:52.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 18:00:55 compute-0 nova_compute[187223]: 2025-11-28 18:00:55.025 187227 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764352840.0245845, e1427d58-0afb-4202-a9d8-bf96366dadcb => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 18:00:55 compute-0 nova_compute[187223]: 2025-11-28 18:00:55.025 187227 INFO nova.compute.manager [-] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] VM Stopped (Lifecycle Event)
Nov 28 18:00:55 compute-0 nova_compute[187223]: 2025-11-28 18:00:55.612 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:00:55 compute-0 nova_compute[187223]: 2025-11-28 18:00:55.872 187227 DEBUG nova.compute.manager [None req-9e23b456-7588-447e-8625-71cb7797ae73 - - - - - -] [instance: e1427d58-0afb-4202-a9d8-bf96366dadcb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 18:00:55 compute-0 nova_compute[187223]: 2025-11-28 18:00:55.946 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:00:56 compute-0 nova_compute[187223]: 2025-11-28 18:00:56.686 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 18:00:56 compute-0 nova_compute[187223]: 2025-11-28 18:00:56.686 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 18:00:56 compute-0 nova_compute[187223]: 2025-11-28 18:00:56.687 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 18:00:56 compute-0 nova_compute[187223]: 2025-11-28 18:00:56.705 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 18:00:59 compute-0 nova_compute[187223]: 2025-11-28 18:00:59.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 18:00:59 compute-0 nova_compute[187223]: 2025-11-28 18:00:59.703 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 18:00:59 compute-0 nova_compute[187223]: 2025-11-28 18:00:59.703 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 18:00:59 compute-0 nova_compute[187223]: 2025-11-28 18:00:59.703 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 18:00:59 compute-0 nova_compute[187223]: 2025-11-28 18:00:59.703 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 18:00:59 compute-0 podman[197556]: time="2025-11-28T18:00:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 18:00:59 compute-0 podman[197556]: @ - - [28/Nov/2025:18:00:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 28 18:00:59 compute-0 podman[197556]: @ - - [28/Nov/2025:18:00:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2597 "" "Go-http-client/1.1"
Nov 28 18:00:59 compute-0 nova_compute[187223]: 2025-11-28 18:00:59.846 187227 WARNING nova.virt.libvirt.driver [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 18:00:59 compute-0 nova_compute[187223]: 2025-11-28 18:00:59.847 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5877MB free_disk=73.34139633178711GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 18:00:59 compute-0 nova_compute[187223]: 2025-11-28 18:00:59.847 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 18:00:59 compute-0 nova_compute[187223]: 2025-11-28 18:00:59.847 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 18:00:59 compute-0 nova_compute[187223]: 2025-11-28 18:00:59.926 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 18:00:59 compute-0 nova_compute[187223]: 2025-11-28 18:00:59.927 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 18:00:59 compute-0 nova_compute[187223]: 2025-11-28 18:00:59.958 187227 DEBUG nova.compute.provider_tree [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 18:00:59 compute-0 nova_compute[187223]: 2025-11-28 18:00:59.980 187227 DEBUG nova.scheduler.client.report [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 18:00:59 compute-0 nova_compute[187223]: 2025-11-28 18:00:59.981 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 18:00:59 compute-0 nova_compute[187223]: 2025-11-28 18:00:59.981 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.134s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 18:01:00 compute-0 nova_compute[187223]: 2025-11-28 18:01:00.614 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:01:00 compute-0 nova_compute[187223]: 2025-11-28 18:01:00.947 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:01:00 compute-0 nova_compute[187223]: 2025-11-28 18:01:00.982 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 18:01:01 compute-0 podman[219525]: 2025-11-28 18:01:01.190928184 +0000 UTC m=+0.051747525 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 18:01:01 compute-0 CROND[219550]: (root) CMD (run-parts /etc/cron.hourly)
Nov 28 18:01:01 compute-0 run-parts[219553]: (/etc/cron.hourly) starting 0anacron
Nov 28 18:01:01 compute-0 openstack_network_exporter[199717]: ERROR   18:01:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 18:01:01 compute-0 openstack_network_exporter[199717]: ERROR   18:01:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 18:01:01 compute-0 openstack_network_exporter[199717]: ERROR   18:01:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 18:01:01 compute-0 openstack_network_exporter[199717]: ERROR   18:01:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 18:01:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 18:01:01 compute-0 openstack_network_exporter[199717]: ERROR   18:01:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 18:01:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 18:01:01 compute-0 run-parts[219559]: (/etc/cron.hourly) finished 0anacron
Nov 28 18:01:01 compute-0 CROND[219549]: (root) CMDEND (run-parts /etc/cron.hourly)
Nov 28 18:01:01 compute-0 nova_compute[187223]: 2025-11-28 18:01:01.679 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 18:01:01 compute-0 nova_compute[187223]: 2025-11-28 18:01:01.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 18:01:05 compute-0 nova_compute[187223]: 2025-11-28 18:01:05.616 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:01:05 compute-0 nova_compute[187223]: 2025-11-28 18:01:05.948 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:01:09 compute-0 ovn_controller[95574]: 2025-11-28T18:01:09Z|00217|memory_trim|INFO|Detected inactivity (last active 30009 ms ago): trimming memory
Nov 28 18:01:10 compute-0 podman[219560]: 2025-11-28 18:01:10.197618029 +0000 UTC m=+0.053023123 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, tcib_managed=true)
Nov 28 18:01:10 compute-0 nova_compute[187223]: 2025-11-28 18:01:10.621 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:01:10 compute-0 nova_compute[187223]: 2025-11-28 18:01:10.950 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:01:12 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:01:12.815 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=30, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3e:3c:af', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '0e:e0:60:f9:5f:25'}, ipsec=False) old=SB_Global(nb_cfg=29) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 18:01:12 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:01:12.816 104433 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 18:01:12 compute-0 nova_compute[187223]: 2025-11-28 18:01:12.817 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:01:15 compute-0 nova_compute[187223]: 2025-11-28 18:01:15.622 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:01:15 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:01:15.818 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2ad2dbac-a967-40fb-b69b-7c374c5f8e9d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '30'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 18:01:15 compute-0 nova_compute[187223]: 2025-11-28 18:01:15.952 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:01:18 compute-0 podman[219581]: 2025-11-28 18:01:18.192687219 +0000 UTC m=+0.055783556 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 18:01:18 compute-0 podman[219582]: 2025-11-28 18:01:18.271149434 +0000 UTC m=+0.117178779 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_controller)
Nov 28 18:01:20 compute-0 podman[219625]: 2025-11-28 18:01:20.201288938 +0000 UTC m=+0.063536159 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, build-date=2025-08-20T13:12:41, version=9.6, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 28 18:01:20 compute-0 nova_compute[187223]: 2025-11-28 18:01:20.624 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:01:20 compute-0 nova_compute[187223]: 2025-11-28 18:01:20.954 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:01:25 compute-0 nova_compute[187223]: 2025-11-28 18:01:25.626 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:01:25 compute-0 nova_compute[187223]: 2025-11-28 18:01:25.956 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:01:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:01:27.714 104433 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 18:01:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:01:27.714 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 18:01:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:01:27.715 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 18:01:29 compute-0 podman[197556]: time="2025-11-28T18:01:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 18:01:29 compute-0 podman[197556]: @ - - [28/Nov/2025:18:01:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 28 18:01:29 compute-0 podman[197556]: @ - - [28/Nov/2025:18:01:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2604 "" "Go-http-client/1.1"
Nov 28 18:01:30 compute-0 nova_compute[187223]: 2025-11-28 18:01:30.628 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:01:30 compute-0 nova_compute[187223]: 2025-11-28 18:01:30.958 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:01:31 compute-0 openstack_network_exporter[199717]: ERROR   18:01:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 18:01:31 compute-0 openstack_network_exporter[199717]: ERROR   18:01:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 18:01:31 compute-0 openstack_network_exporter[199717]: ERROR   18:01:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 18:01:31 compute-0 openstack_network_exporter[199717]: ERROR   18:01:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 18:01:31 compute-0 openstack_network_exporter[199717]: 
Nov 28 18:01:31 compute-0 openstack_network_exporter[199717]: ERROR   18:01:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 18:01:31 compute-0 openstack_network_exporter[199717]: 
Nov 28 18:01:32 compute-0 podman[219647]: 2025-11-28 18:01:32.198744744 +0000 UTC m=+0.053915531 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 18:01:35 compute-0 nova_compute[187223]: 2025-11-28 18:01:35.630 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:01:35 compute-0 nova_compute[187223]: 2025-11-28 18:01:35.960 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:01:40 compute-0 nova_compute[187223]: 2025-11-28 18:01:40.631 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:01:40 compute-0 nova_compute[187223]: 2025-11-28 18:01:40.961 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:01:41 compute-0 podman[219671]: 2025-11-28 18:01:41.210599755 +0000 UTC m=+0.060665223 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 28 18:01:45 compute-0 ovn_controller[95574]: 2025-11-28T18:01:45Z|00218|memory_trim|INFO|Detected inactivity (last active 30013 ms ago): trimming memory
Nov 28 18:01:45 compute-0 nova_compute[187223]: 2025-11-28 18:01:45.633 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:01:45 compute-0 nova_compute[187223]: 2025-11-28 18:01:45.963 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:01:49 compute-0 podman[219694]: 2025-11-28 18:01:49.228570262 +0000 UTC m=+0.081546500 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd)
Nov 28 18:01:49 compute-0 podman[219695]: 2025-11-28 18:01:49.267103369 +0000 UTC m=+0.125533721 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 18:01:49 compute-0 nova_compute[187223]: 2025-11-28 18:01:49.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 18:01:49 compute-0 nova_compute[187223]: 2025-11-28 18:01:49.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 18:01:49 compute-0 nova_compute[187223]: 2025-11-28 18:01:49.684 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 18:01:50 compute-0 nova_compute[187223]: 2025-11-28 18:01:50.636 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:01:50 compute-0 nova_compute[187223]: 2025-11-28 18:01:50.965 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:01:51 compute-0 podman[219740]: 2025-11-28 18:01:51.217622716 +0000 UTC m=+0.072394285 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, distribution-scope=public, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_id=edpm, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, version=9.6, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter)
Nov 28 18:01:52 compute-0 nova_compute[187223]: 2025-11-28 18:01:52.685 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 18:01:52 compute-0 nova_compute[187223]: 2025-11-28 18:01:52.686 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 18:01:55 compute-0 nova_compute[187223]: 2025-11-28 18:01:55.638 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:01:55 compute-0 nova_compute[187223]: 2025-11-28 18:01:55.967 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:01:58 compute-0 nova_compute[187223]: 2025-11-28 18:01:58.685 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 18:01:58 compute-0 nova_compute[187223]: 2025-11-28 18:01:58.685 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 18:01:58 compute-0 nova_compute[187223]: 2025-11-28 18:01:58.685 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 18:01:58 compute-0 nova_compute[187223]: 2025-11-28 18:01:58.701 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 18:01:59 compute-0 podman[197556]: time="2025-11-28T18:01:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 18:01:59 compute-0 podman[197556]: @ - - [28/Nov/2025:18:01:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 28 18:01:59 compute-0 podman[197556]: @ - - [28/Nov/2025:18:01:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2600 "" "Go-http-client/1.1"
Nov 28 18:02:00 compute-0 nova_compute[187223]: 2025-11-28 18:02:00.641 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:02:00 compute-0 nova_compute[187223]: 2025-11-28 18:02:00.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 18:02:00 compute-0 nova_compute[187223]: 2025-11-28 18:02:00.754 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 18:02:00 compute-0 nova_compute[187223]: 2025-11-28 18:02:00.755 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 18:02:00 compute-0 nova_compute[187223]: 2025-11-28 18:02:00.755 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 18:02:00 compute-0 nova_compute[187223]: 2025-11-28 18:02:00.755 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 18:02:00 compute-0 nova_compute[187223]: 2025-11-28 18:02:00.956 187227 WARNING nova.virt.libvirt.driver [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 18:02:00 compute-0 nova_compute[187223]: 2025-11-28 18:02:00.958 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5884MB free_disk=73.34137725830078GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 18:02:00 compute-0 nova_compute[187223]: 2025-11-28 18:02:00.958 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 18:02:00 compute-0 nova_compute[187223]: 2025-11-28 18:02:00.958 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 18:02:00 compute-0 nova_compute[187223]: 2025-11-28 18:02:00.969 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:02:01 compute-0 nova_compute[187223]: 2025-11-28 18:02:01.069 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 18:02:01 compute-0 nova_compute[187223]: 2025-11-28 18:02:01.070 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 18:02:01 compute-0 nova_compute[187223]: 2025-11-28 18:02:01.093 187227 DEBUG nova.scheduler.client.report [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Refreshing inventories for resource provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 28 18:02:01 compute-0 nova_compute[187223]: 2025-11-28 18:02:01.126 187227 DEBUG nova.scheduler.client.report [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Updating ProviderTree inventory for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 28 18:02:01 compute-0 nova_compute[187223]: 2025-11-28 18:02:01.127 187227 DEBUG nova.compute.provider_tree [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Updating inventory in ProviderTree for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 28 18:02:01 compute-0 nova_compute[187223]: 2025-11-28 18:02:01.148 187227 DEBUG nova.scheduler.client.report [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Refreshing aggregate associations for resource provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 28 18:02:01 compute-0 nova_compute[187223]: 2025-11-28 18:02:01.169 187227 DEBUG nova.scheduler.client.report [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Refreshing trait associations for resource provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5, traits: COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_USB,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE41,HW_CPU_X86_SSE42,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_SSE,HW_CPU_X86_SSSE3,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 28 18:02:01 compute-0 nova_compute[187223]: 2025-11-28 18:02:01.201 187227 DEBUG nova.compute.provider_tree [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 18:02:01 compute-0 nova_compute[187223]: 2025-11-28 18:02:01.216 187227 DEBUG nova.scheduler.client.report [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 18:02:01 compute-0 nova_compute[187223]: 2025-11-28 18:02:01.217 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 18:02:01 compute-0 nova_compute[187223]: 2025-11-28 18:02:01.217 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.259s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 18:02:01 compute-0 openstack_network_exporter[199717]: ERROR   18:02:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 18:02:01 compute-0 openstack_network_exporter[199717]: ERROR   18:02:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 18:02:01 compute-0 openstack_network_exporter[199717]: ERROR   18:02:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 18:02:01 compute-0 openstack_network_exporter[199717]: ERROR   18:02:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 18:02:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 18:02:01 compute-0 openstack_network_exporter[199717]: ERROR   18:02:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 18:02:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 18:02:03 compute-0 nova_compute[187223]: 2025-11-28 18:02:03.218 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 18:02:03 compute-0 nova_compute[187223]: 2025-11-28 18:02:03.218 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 18:02:03 compute-0 podman[219762]: 2025-11-28 18:02:03.248764142 +0000 UTC m=+0.100615622 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 18:02:03 compute-0 nova_compute[187223]: 2025-11-28 18:02:03.678 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 18:02:05 compute-0 nova_compute[187223]: 2025-11-28 18:02:05.642 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:02:05 compute-0 nova_compute[187223]: 2025-11-28 18:02:05.971 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:02:06 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:02:06.027 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=31, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3e:3c:af', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '0e:e0:60:f9:5f:25'}, ipsec=False) old=SB_Global(nb_cfg=30) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 18:02:06 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:02:06.028 104433 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 18:02:06 compute-0 nova_compute[187223]: 2025-11-28 18:02:06.028 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:02:07 compute-0 nova_compute[187223]: 2025-11-28 18:02:07.892 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:02:08 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:02:08.030 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2ad2dbac-a967-40fb-b69b-7c374c5f8e9d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '31'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 18:02:10 compute-0 nova_compute[187223]: 2025-11-28 18:02:10.646 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:02:10 compute-0 nova_compute[187223]: 2025-11-28 18:02:10.973 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:02:12 compute-0 podman[219786]: 2025-11-28 18:02:12.209063944 +0000 UTC m=+0.062490558 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 18:02:15 compute-0 nova_compute[187223]: 2025-11-28 18:02:15.650 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:02:15 compute-0 nova_compute[187223]: 2025-11-28 18:02:15.975 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:02:17 compute-0 nova_compute[187223]: 2025-11-28 18:02:17.685 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 18:02:17 compute-0 nova_compute[187223]: 2025-11-28 18:02:17.685 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 18:02:17 compute-0 nova_compute[187223]: 2025-11-28 18:02:17.686 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 18:02:17 compute-0 nova_compute[187223]: 2025-11-28 18:02:17.687 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 18:02:17 compute-0 nova_compute[187223]: 2025-11-28 18:02:17.687 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 18:02:17 compute-0 nova_compute[187223]: 2025-11-28 18:02:17.688 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 18:02:17 compute-0 nova_compute[187223]: 2025-11-28 18:02:17.688 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 18:02:17 compute-0 nova_compute[187223]: 2025-11-28 18:02:17.711 187227 DEBUG nova.virt.libvirt.imagecache [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:314
Nov 28 18:02:17 compute-0 nova_compute[187223]: 2025-11-28 18:02:17.712 187227 WARNING nova.virt.libvirt.imagecache [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba
Nov 28 18:02:17 compute-0 nova_compute[187223]: 2025-11-28 18:02:17.712 187227 INFO nova.virt.libvirt.imagecache [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Removable base files: /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba
Nov 28 18:02:17 compute-0 nova_compute[187223]: 2025-11-28 18:02:17.712 187227 INFO nova.virt.libvirt.imagecache [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba
Nov 28 18:02:17 compute-0 nova_compute[187223]: 2025-11-28 18:02:17.713 187227 DEBUG nova.virt.libvirt.imagecache [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:350
Nov 28 18:02:17 compute-0 nova_compute[187223]: 2025-11-28 18:02:17.713 187227 DEBUG nova.virt.libvirt.imagecache [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:299
Nov 28 18:02:17 compute-0 nova_compute[187223]: 2025-11-28 18:02:17.713 187227 DEBUG nova.virt.libvirt.imagecache [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:284
Nov 28 18:02:20 compute-0 podman[219806]: 2025-11-28 18:02:20.214433343 +0000 UTC m=+0.079889830 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true)
Nov 28 18:02:20 compute-0 podman[219807]: 2025-11-28 18:02:20.243183687 +0000 UTC m=+0.105549411 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Nov 28 18:02:20 compute-0 nova_compute[187223]: 2025-11-28 18:02:20.652 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:02:20 compute-0 nova_compute[187223]: 2025-11-28 18:02:20.978 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:02:22 compute-0 podman[219853]: 2025-11-28 18:02:22.210411594 +0000 UTC m=+0.077957552 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, version=9.6, name=ubi9-minimal, architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.expose-services=, vcs-type=git, config_id=edpm, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public)
Nov 28 18:02:25 compute-0 nova_compute[187223]: 2025-11-28 18:02:25.655 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:02:25 compute-0 nova_compute[187223]: 2025-11-28 18:02:25.981 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:02:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:02:27.715 104433 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 18:02:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:02:27.716 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 18:02:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:02:27.716 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 18:02:29 compute-0 podman[197556]: time="2025-11-28T18:02:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 18:02:29 compute-0 podman[197556]: @ - - [28/Nov/2025:18:02:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 28 18:02:29 compute-0 podman[197556]: @ - - [28/Nov/2025:18:02:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2598 "" "Go-http-client/1.1"
Nov 28 18:02:30 compute-0 nova_compute[187223]: 2025-11-28 18:02:30.658 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:02:30 compute-0 nova_compute[187223]: 2025-11-28 18:02:30.984 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:02:31 compute-0 openstack_network_exporter[199717]: ERROR   18:02:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 18:02:31 compute-0 openstack_network_exporter[199717]: ERROR   18:02:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 18:02:31 compute-0 openstack_network_exporter[199717]: ERROR   18:02:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 18:02:31 compute-0 openstack_network_exporter[199717]: ERROR   18:02:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 18:02:31 compute-0 openstack_network_exporter[199717]: 
Nov 28 18:02:31 compute-0 openstack_network_exporter[199717]: ERROR   18:02:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 18:02:31 compute-0 openstack_network_exporter[199717]: 
Nov 28 18:02:34 compute-0 podman[219876]: 2025-11-28 18:02:34.225395947 +0000 UTC m=+0.078948212 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 18:02:35 compute-0 nova_compute[187223]: 2025-11-28 18:02:35.659 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:02:35 compute-0 nova_compute[187223]: 2025-11-28 18:02:35.985 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:02:40 compute-0 nova_compute[187223]: 2025-11-28 18:02:40.661 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:02:40 compute-0 ovn_controller[95574]: 2025-11-28T18:02:40Z|00219|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Nov 28 18:02:40 compute-0 nova_compute[187223]: 2025-11-28 18:02:40.987 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:02:43 compute-0 podman[219900]: 2025-11-28 18:02:43.189115128 +0000 UTC m=+0.054356634 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Nov 28 18:02:43 compute-0 sshd-session[219920]: Invalid user sol from 193.32.162.146 port 34764
Nov 28 18:02:43 compute-0 sshd-session[219920]: Connection closed by invalid user sol 193.32.162.146 port 34764 [preauth]
Nov 28 18:02:45 compute-0 nova_compute[187223]: 2025-11-28 18:02:45.662 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:02:45 compute-0 nova_compute[187223]: 2025-11-28 18:02:45.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 18:02:45 compute-0 nova_compute[187223]: 2025-11-28 18:02:45.683 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 28 18:02:45 compute-0 nova_compute[187223]: 2025-11-28 18:02:45.697 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 28 18:02:45 compute-0 nova_compute[187223]: 2025-11-28 18:02:45.988 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:02:50 compute-0 nova_compute[187223]: 2025-11-28 18:02:50.108 187227 DEBUG oslo_concurrency.lockutils [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] Acquiring lock "b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 18:02:50 compute-0 nova_compute[187223]: 2025-11-28 18:02:50.109 187227 DEBUG oslo_concurrency.lockutils [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] Lock "b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 18:02:50 compute-0 nova_compute[187223]: 2025-11-28 18:02:50.131 187227 DEBUG nova.compute.manager [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 28 18:02:50 compute-0 nova_compute[187223]: 2025-11-28 18:02:50.204 187227 DEBUG oslo_concurrency.lockutils [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 18:02:50 compute-0 nova_compute[187223]: 2025-11-28 18:02:50.205 187227 DEBUG oslo_concurrency.lockutils [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 18:02:50 compute-0 nova_compute[187223]: 2025-11-28 18:02:50.213 187227 DEBUG nova.virt.hardware [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 28 18:02:50 compute-0 nova_compute[187223]: 2025-11-28 18:02:50.213 187227 INFO nova.compute.claims [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Claim successful on node compute-0.ctlplane.example.com
Nov 28 18:02:50 compute-0 nova_compute[187223]: 2025-11-28 18:02:50.318 187227 DEBUG nova.compute.provider_tree [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 18:02:50 compute-0 nova_compute[187223]: 2025-11-28 18:02:50.415 187227 DEBUG nova.scheduler.client.report [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 18:02:50 compute-0 nova_compute[187223]: 2025-11-28 18:02:50.569 187227 DEBUG oslo_concurrency.lockutils [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.364s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 18:02:50 compute-0 nova_compute[187223]: 2025-11-28 18:02:50.570 187227 DEBUG nova.compute.manager [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 28 18:02:50 compute-0 nova_compute[187223]: 2025-11-28 18:02:50.636 187227 DEBUG nova.compute.manager [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 28 18:02:50 compute-0 nova_compute[187223]: 2025-11-28 18:02:50.637 187227 DEBUG nova.network.neutron [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 28 18:02:50 compute-0 nova_compute[187223]: 2025-11-28 18:02:50.656 187227 INFO nova.virt.libvirt.driver [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 28 18:02:50 compute-0 nova_compute[187223]: 2025-11-28 18:02:50.663 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:02:50 compute-0 nova_compute[187223]: 2025-11-28 18:02:50.672 187227 DEBUG nova.compute.manager [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 28 18:02:50 compute-0 nova_compute[187223]: 2025-11-28 18:02:50.698 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 18:02:50 compute-0 nova_compute[187223]: 2025-11-28 18:02:50.698 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 18:02:50 compute-0 nova_compute[187223]: 2025-11-28 18:02:50.698 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 18:02:50 compute-0 nova_compute[187223]: 2025-11-28 18:02:50.746 187227 DEBUG nova.compute.manager [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 28 18:02:50 compute-0 nova_compute[187223]: 2025-11-28 18:02:50.747 187227 DEBUG nova.virt.libvirt.driver [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 28 18:02:50 compute-0 nova_compute[187223]: 2025-11-28 18:02:50.747 187227 INFO nova.virt.libvirt.driver [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Creating image(s)
Nov 28 18:02:50 compute-0 nova_compute[187223]: 2025-11-28 18:02:50.748 187227 DEBUG oslo_concurrency.lockutils [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] Acquiring lock "/var/lib/nova/instances/b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 18:02:50 compute-0 nova_compute[187223]: 2025-11-28 18:02:50.748 187227 DEBUG oslo_concurrency.lockutils [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] Lock "/var/lib/nova/instances/b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 18:02:50 compute-0 nova_compute[187223]: 2025-11-28 18:02:50.749 187227 DEBUG oslo_concurrency.lockutils [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] Lock "/var/lib/nova/instances/b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 18:02:50 compute-0 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 18:02:50 compute-0 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 18:02:50 compute-0 nova_compute[187223]: 2025-11-28 18:02:50.762 187227 DEBUG oslo_concurrency.processutils [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 18:02:50 compute-0 nova_compute[187223]: 2025-11-28 18:02:50.832 187227 DEBUG oslo_concurrency.processutils [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 18:02:50 compute-0 nova_compute[187223]: 2025-11-28 18:02:50.833 187227 DEBUG oslo_concurrency.lockutils [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] Acquiring lock "bd6565e0cc32420aac21dd3de8842bc53bb13eba" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 18:02:50 compute-0 nova_compute[187223]: 2025-11-28 18:02:50.834 187227 DEBUG oslo_concurrency.lockutils [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] Lock "bd6565e0cc32420aac21dd3de8842bc53bb13eba" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 18:02:50 compute-0 nova_compute[187223]: 2025-11-28 18:02:50.851 187227 DEBUG oslo_concurrency.processutils [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 18:02:50 compute-0 nova_compute[187223]: 2025-11-28 18:02:50.936 187227 DEBUG oslo_concurrency.processutils [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 18:02:50 compute-0 nova_compute[187223]: 2025-11-28 18:02:50.937 187227 DEBUG oslo_concurrency.processutils [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba,backing_fmt=raw /var/lib/nova/instances/b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 18:02:50 compute-0 nova_compute[187223]: 2025-11-28 18:02:50.972 187227 DEBUG oslo_concurrency.processutils [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba,backing_fmt=raw /var/lib/nova/instances/b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 18:02:50 compute-0 nova_compute[187223]: 2025-11-28 18:02:50.973 187227 DEBUG oslo_concurrency.lockutils [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] Lock "bd6565e0cc32420aac21dd3de8842bc53bb13eba" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.139s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 18:02:50 compute-0 nova_compute[187223]: 2025-11-28 18:02:50.973 187227 DEBUG oslo_concurrency.processutils [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 18:02:50 compute-0 nova_compute[187223]: 2025-11-28 18:02:50.990 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:02:51 compute-0 nova_compute[187223]: 2025-11-28 18:02:51.028 187227 DEBUG oslo_concurrency.processutils [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bd6565e0cc32420aac21dd3de8842bc53bb13eba --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 18:02:51 compute-0 nova_compute[187223]: 2025-11-28 18:02:51.029 187227 DEBUG nova.virt.disk.api [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] Checking if we can resize image /var/lib/nova/instances/b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 28 18:02:51 compute-0 nova_compute[187223]: 2025-11-28 18:02:51.029 187227 DEBUG oslo_concurrency.processutils [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 18:02:51 compute-0 nova_compute[187223]: 2025-11-28 18:02:51.084 187227 DEBUG oslo_concurrency.processutils [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 18:02:51 compute-0 nova_compute[187223]: 2025-11-28 18:02:51.085 187227 DEBUG nova.virt.disk.api [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] Cannot resize image /var/lib/nova/instances/b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 28 18:02:51 compute-0 nova_compute[187223]: 2025-11-28 18:02:51.086 187227 DEBUG nova.objects.instance [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] Lazy-loading 'migration_context' on Instance uuid b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 18:02:51 compute-0 nova_compute[187223]: 2025-11-28 18:02:51.105 187227 DEBUG nova.virt.libvirt.driver [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 28 18:02:51 compute-0 nova_compute[187223]: 2025-11-28 18:02:51.106 187227 DEBUG nova.virt.libvirt.driver [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Ensure instance console log exists: /var/lib/nova/instances/b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 28 18:02:51 compute-0 nova_compute[187223]: 2025-11-28 18:02:51.106 187227 DEBUG oslo_concurrency.lockutils [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 18:02:51 compute-0 nova_compute[187223]: 2025-11-28 18:02:51.107 187227 DEBUG oslo_concurrency.lockutils [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 18:02:51 compute-0 nova_compute[187223]: 2025-11-28 18:02:51.107 187227 DEBUG oslo_concurrency.lockutils [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 18:02:51 compute-0 podman[219939]: 2025-11-28 18:02:51.203135431 +0000 UTC m=+0.061681173 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3)
Nov 28 18:02:51 compute-0 podman[219940]: 2025-11-28 18:02:51.247992018 +0000 UTC m=+0.098029555 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 28 18:02:51 compute-0 nova_compute[187223]: 2025-11-28 18:02:51.595 187227 DEBUG nova.policy [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '17c1adf6f47747cb879184a3da9c1d22', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '880f84e836504513b156c4ba7b7d1dc4', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 28 18:02:51 compute-0 nova_compute[187223]: 2025-11-28 18:02:51.679 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 18:02:52 compute-0 nova_compute[187223]: 2025-11-28 18:02:52.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 18:02:53 compute-0 nova_compute[187223]: 2025-11-28 18:02:53.176 187227 DEBUG nova.network.neutron [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Successfully created port: e2e11670-b4f0-49f3-ab8b-779b9ac9caab _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 28 18:02:53 compute-0 podman[219987]: 2025-11-28 18:02:53.233429562 +0000 UTC m=+0.079121587 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, maintainer=Red Hat, Inc., architecture=x86_64, distribution-scope=public, config_id=edpm, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, name=ubi9-minimal)
Nov 28 18:02:53 compute-0 nova_compute[187223]: 2025-11-28 18:02:53.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 18:02:54 compute-0 nova_compute[187223]: 2025-11-28 18:02:54.646 187227 DEBUG nova.network.neutron [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Successfully updated port: e2e11670-b4f0-49f3-ab8b-779b9ac9caab _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 28 18:02:54 compute-0 nova_compute[187223]: 2025-11-28 18:02:54.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 18:02:54 compute-0 nova_compute[187223]: 2025-11-28 18:02:54.702 187227 DEBUG oslo_concurrency.lockutils [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] Acquiring lock "refresh_cache-b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 18:02:54 compute-0 nova_compute[187223]: 2025-11-28 18:02:54.702 187227 DEBUG oslo_concurrency.lockutils [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] Acquired lock "refresh_cache-b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 18:02:54 compute-0 nova_compute[187223]: 2025-11-28 18:02:54.703 187227 DEBUG nova.network.neutron [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 28 18:02:54 compute-0 nova_compute[187223]: 2025-11-28 18:02:54.802 187227 DEBUG nova.compute.manager [req-97b5e366-a52b-4a25-b189-cfacc51d4fc2 req-0c99cb21-ecb5-42ac-b231-552e5ffe5bc3 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Received event network-changed-e2e11670-b4f0-49f3-ab8b-779b9ac9caab external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 18:02:54 compute-0 nova_compute[187223]: 2025-11-28 18:02:54.802 187227 DEBUG nova.compute.manager [req-97b5e366-a52b-4a25-b189-cfacc51d4fc2 req-0c99cb21-ecb5-42ac-b231-552e5ffe5bc3 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Refreshing instance network info cache due to event network-changed-e2e11670-b4f0-49f3-ab8b-779b9ac9caab. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 28 18:02:54 compute-0 nova_compute[187223]: 2025-11-28 18:02:54.802 187227 DEBUG oslo_concurrency.lockutils [req-97b5e366-a52b-4a25-b189-cfacc51d4fc2 req-0c99cb21-ecb5-42ac-b231-552e5ffe5bc3 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "refresh_cache-b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 18:02:55 compute-0 nova_compute[187223]: 2025-11-28 18:02:55.545 187227 DEBUG nova.network.neutron [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 28 18:02:55 compute-0 nova_compute[187223]: 2025-11-28 18:02:55.665 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:02:55 compute-0 nova_compute[187223]: 2025-11-28 18:02:55.992 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:02:56 compute-0 nova_compute[187223]: 2025-11-28 18:02:56.741 187227 DEBUG nova.network.neutron [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Updating instance_info_cache with network_info: [{"id": "e2e11670-b4f0-49f3-ab8b-779b9ac9caab", "address": "fa:16:3e:55:a1:8c", "network": {"id": "73ece4ba-618d-48d3-8f69-d9fe38606d51", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1107034145-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "880f84e836504513b156c4ba7b7d1dc4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape2e11670-b4", "ovs_interfaceid": "e2e11670-b4f0-49f3-ab8b-779b9ac9caab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 18:02:56 compute-0 nova_compute[187223]: 2025-11-28 18:02:56.761 187227 DEBUG oslo_concurrency.lockutils [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] Releasing lock "refresh_cache-b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 18:02:56 compute-0 nova_compute[187223]: 2025-11-28 18:02:56.762 187227 DEBUG nova.compute.manager [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Instance network_info: |[{"id": "e2e11670-b4f0-49f3-ab8b-779b9ac9caab", "address": "fa:16:3e:55:a1:8c", "network": {"id": "73ece4ba-618d-48d3-8f69-d9fe38606d51", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1107034145-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "880f84e836504513b156c4ba7b7d1dc4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape2e11670-b4", "ovs_interfaceid": "e2e11670-b4f0-49f3-ab8b-779b9ac9caab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 28 18:02:56 compute-0 nova_compute[187223]: 2025-11-28 18:02:56.762 187227 DEBUG oslo_concurrency.lockutils [req-97b5e366-a52b-4a25-b189-cfacc51d4fc2 req-0c99cb21-ecb5-42ac-b231-552e5ffe5bc3 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquired lock "refresh_cache-b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 18:02:56 compute-0 nova_compute[187223]: 2025-11-28 18:02:56.763 187227 DEBUG nova.network.neutron [req-97b5e366-a52b-4a25-b189-cfacc51d4fc2 req-0c99cb21-ecb5-42ac-b231-552e5ffe5bc3 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Refreshing network info cache for port e2e11670-b4f0-49f3-ab8b-779b9ac9caab _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 28 18:02:56 compute-0 nova_compute[187223]: 2025-11-28 18:02:56.767 187227 DEBUG nova.virt.libvirt.driver [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Start _get_guest_xml network_info=[{"id": "e2e11670-b4f0-49f3-ab8b-779b9ac9caab", "address": "fa:16:3e:55:a1:8c", "network": {"id": "73ece4ba-618d-48d3-8f69-d9fe38606d51", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1107034145-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "880f84e836504513b156c4ba7b7d1dc4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape2e11670-b4", "ovs_interfaceid": "e2e11670-b4f0-49f3-ab8b-779b9ac9caab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T17:27:07Z,direct_url=<?>,disk_format='qcow2',id=e66bcfff-a835-4b6a-9892-490d158c356a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ca3b936304c947f98c618f4bf25dee0e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-28T17:27:08Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_format': None, 'device_name': '/dev/vda', 'size': 0, 'guest_format': None, 'encryption_options': None, 'image_id': 'e66bcfff-a835-4b6a-9892-490d158c356a'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 28 18:02:56 compute-0 nova_compute[187223]: 2025-11-28 18:02:56.773 187227 WARNING nova.virt.libvirt.driver [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 18:02:56 compute-0 nova_compute[187223]: 2025-11-28 18:02:56.779 187227 DEBUG nova.virt.libvirt.host [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 28 18:02:56 compute-0 nova_compute[187223]: 2025-11-28 18:02:56.779 187227 DEBUG nova.virt.libvirt.host [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 28 18:02:56 compute-0 nova_compute[187223]: 2025-11-28 18:02:56.786 187227 DEBUG nova.virt.libvirt.host [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 28 18:02:56 compute-0 nova_compute[187223]: 2025-11-28 18:02:56.787 187227 DEBUG nova.virt.libvirt.host [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 28 18:02:56 compute-0 nova_compute[187223]: 2025-11-28 18:02:56.788 187227 DEBUG nova.virt.libvirt.driver [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 28 18:02:56 compute-0 nova_compute[187223]: 2025-11-28 18:02:56.788 187227 DEBUG nova.virt.hardware [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-28T17:27:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6f44bded-bdbe-4623-9c87-afc5919e8381',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T17:27:07Z,direct_url=<?>,disk_format='qcow2',id=e66bcfff-a835-4b6a-9892-490d158c356a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ca3b936304c947f98c618f4bf25dee0e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-28T17:27:08Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 28 18:02:56 compute-0 nova_compute[187223]: 2025-11-28 18:02:56.789 187227 DEBUG nova.virt.hardware [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 28 18:02:56 compute-0 nova_compute[187223]: 2025-11-28 18:02:56.789 187227 DEBUG nova.virt.hardware [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 28 18:02:56 compute-0 nova_compute[187223]: 2025-11-28 18:02:56.789 187227 DEBUG nova.virt.hardware [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 28 18:02:56 compute-0 nova_compute[187223]: 2025-11-28 18:02:56.790 187227 DEBUG nova.virt.hardware [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 28 18:02:56 compute-0 nova_compute[187223]: 2025-11-28 18:02:56.790 187227 DEBUG nova.virt.hardware [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 28 18:02:56 compute-0 nova_compute[187223]: 2025-11-28 18:02:56.790 187227 DEBUG nova.virt.hardware [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 28 18:02:56 compute-0 nova_compute[187223]: 2025-11-28 18:02:56.790 187227 DEBUG nova.virt.hardware [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 28 18:02:56 compute-0 nova_compute[187223]: 2025-11-28 18:02:56.791 187227 DEBUG nova.virt.hardware [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 28 18:02:56 compute-0 nova_compute[187223]: 2025-11-28 18:02:56.791 187227 DEBUG nova.virt.hardware [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 28 18:02:56 compute-0 nova_compute[187223]: 2025-11-28 18:02:56.791 187227 DEBUG nova.virt.hardware [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 28 18:02:56 compute-0 nova_compute[187223]: 2025-11-28 18:02:56.796 187227 DEBUG nova.virt.libvirt.vif [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T18:02:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-1100562807',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-1100562807',id=29,image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='880f84e836504513b156c4ba7b7d1dc4',ramdisk_id='',reservation_id='r-eydq3mzo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-2088588059',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-2088588059-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T18:02:50Z,user_data=None,user_id='17c1adf6f47747cb879184a3da9c1d22',uuid=b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e2e11670-b4f0-49f3-ab8b-779b9ac9caab", "address": "fa:16:3e:55:a1:8c", "network": {"id": "73ece4ba-618d-48d3-8f69-d9fe38606d51", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1107034145-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "880f84e836504513b156c4ba7b7d1dc4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape2e11670-b4", "ovs_interfaceid": "e2e11670-b4f0-49f3-ab8b-779b9ac9caab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 28 18:02:56 compute-0 nova_compute[187223]: 2025-11-28 18:02:56.796 187227 DEBUG nova.network.os_vif_util [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] Converting VIF {"id": "e2e11670-b4f0-49f3-ab8b-779b9ac9caab", "address": "fa:16:3e:55:a1:8c", "network": {"id": "73ece4ba-618d-48d3-8f69-d9fe38606d51", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1107034145-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "880f84e836504513b156c4ba7b7d1dc4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape2e11670-b4", "ovs_interfaceid": "e2e11670-b4f0-49f3-ab8b-779b9ac9caab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 18:02:56 compute-0 nova_compute[187223]: 2025-11-28 18:02:56.797 187227 DEBUG nova.network.os_vif_util [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:55:a1:8c,bridge_name='br-int',has_traffic_filtering=True,id=e2e11670-b4f0-49f3-ab8b-779b9ac9caab,network=Network(73ece4ba-618d-48d3-8f69-d9fe38606d51),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape2e11670-b4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 18:02:56 compute-0 nova_compute[187223]: 2025-11-28 18:02:56.798 187227 DEBUG nova.objects.instance [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] Lazy-loading 'pci_devices' on Instance uuid b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 18:02:56 compute-0 nova_compute[187223]: 2025-11-28 18:02:56.825 187227 DEBUG nova.virt.libvirt.driver [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] End _get_guest_xml xml=<domain type="kvm">
Nov 28 18:02:56 compute-0 nova_compute[187223]:   <uuid>b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1</uuid>
Nov 28 18:02:56 compute-0 nova_compute[187223]:   <name>instance-0000001d</name>
Nov 28 18:02:56 compute-0 nova_compute[187223]:   <memory>131072</memory>
Nov 28 18:02:56 compute-0 nova_compute[187223]:   <vcpu>1</vcpu>
Nov 28 18:02:56 compute-0 nova_compute[187223]:   <metadata>
Nov 28 18:02:56 compute-0 nova_compute[187223]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 28 18:02:56 compute-0 nova_compute[187223]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 28 18:02:56 compute-0 nova_compute[187223]:       <nova:name>tempest-TestExecuteZoneMigrationStrategy-server-1100562807</nova:name>
Nov 28 18:02:56 compute-0 nova_compute[187223]:       <nova:creationTime>2025-11-28 18:02:56</nova:creationTime>
Nov 28 18:02:56 compute-0 nova_compute[187223]:       <nova:flavor name="m1.nano">
Nov 28 18:02:56 compute-0 nova_compute[187223]:         <nova:memory>128</nova:memory>
Nov 28 18:02:56 compute-0 nova_compute[187223]:         <nova:disk>1</nova:disk>
Nov 28 18:02:56 compute-0 nova_compute[187223]:         <nova:swap>0</nova:swap>
Nov 28 18:02:56 compute-0 nova_compute[187223]:         <nova:ephemeral>0</nova:ephemeral>
Nov 28 18:02:56 compute-0 nova_compute[187223]:         <nova:vcpus>1</nova:vcpus>
Nov 28 18:02:56 compute-0 nova_compute[187223]:       </nova:flavor>
Nov 28 18:02:56 compute-0 nova_compute[187223]:       <nova:owner>
Nov 28 18:02:56 compute-0 nova_compute[187223]:         <nova:user uuid="17c1adf6f47747cb879184a3da9c1d22">tempest-TestExecuteZoneMigrationStrategy-2088588059-project-member</nova:user>
Nov 28 18:02:56 compute-0 nova_compute[187223]:         <nova:project uuid="880f84e836504513b156c4ba7b7d1dc4">tempest-TestExecuteZoneMigrationStrategy-2088588059</nova:project>
Nov 28 18:02:56 compute-0 nova_compute[187223]:       </nova:owner>
Nov 28 18:02:56 compute-0 nova_compute[187223]:       <nova:root type="image" uuid="e66bcfff-a835-4b6a-9892-490d158c356a"/>
Nov 28 18:02:56 compute-0 nova_compute[187223]:       <nova:ports>
Nov 28 18:02:56 compute-0 nova_compute[187223]:         <nova:port uuid="e2e11670-b4f0-49f3-ab8b-779b9ac9caab">
Nov 28 18:02:56 compute-0 nova_compute[187223]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 28 18:02:56 compute-0 nova_compute[187223]:         </nova:port>
Nov 28 18:02:56 compute-0 nova_compute[187223]:       </nova:ports>
Nov 28 18:02:56 compute-0 nova_compute[187223]:     </nova:instance>
Nov 28 18:02:56 compute-0 nova_compute[187223]:   </metadata>
Nov 28 18:02:56 compute-0 nova_compute[187223]:   <sysinfo type="smbios">
Nov 28 18:02:56 compute-0 nova_compute[187223]:     <system>
Nov 28 18:02:56 compute-0 nova_compute[187223]:       <entry name="manufacturer">RDO</entry>
Nov 28 18:02:56 compute-0 nova_compute[187223]:       <entry name="product">OpenStack Compute</entry>
Nov 28 18:02:56 compute-0 nova_compute[187223]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 28 18:02:56 compute-0 nova_compute[187223]:       <entry name="serial">b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1</entry>
Nov 28 18:02:56 compute-0 nova_compute[187223]:       <entry name="uuid">b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1</entry>
Nov 28 18:02:56 compute-0 nova_compute[187223]:       <entry name="family">Virtual Machine</entry>
Nov 28 18:02:56 compute-0 nova_compute[187223]:     </system>
Nov 28 18:02:56 compute-0 nova_compute[187223]:   </sysinfo>
Nov 28 18:02:56 compute-0 nova_compute[187223]:   <os>
Nov 28 18:02:56 compute-0 nova_compute[187223]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 28 18:02:56 compute-0 nova_compute[187223]:     <boot dev="hd"/>
Nov 28 18:02:56 compute-0 nova_compute[187223]:     <smbios mode="sysinfo"/>
Nov 28 18:02:56 compute-0 nova_compute[187223]:   </os>
Nov 28 18:02:56 compute-0 nova_compute[187223]:   <features>
Nov 28 18:02:56 compute-0 nova_compute[187223]:     <acpi/>
Nov 28 18:02:56 compute-0 nova_compute[187223]:     <apic/>
Nov 28 18:02:56 compute-0 nova_compute[187223]:     <vmcoreinfo/>
Nov 28 18:02:56 compute-0 nova_compute[187223]:   </features>
Nov 28 18:02:56 compute-0 nova_compute[187223]:   <clock offset="utc">
Nov 28 18:02:56 compute-0 nova_compute[187223]:     <timer name="pit" tickpolicy="delay"/>
Nov 28 18:02:56 compute-0 nova_compute[187223]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 28 18:02:56 compute-0 nova_compute[187223]:     <timer name="hpet" present="no"/>
Nov 28 18:02:56 compute-0 nova_compute[187223]:   </clock>
Nov 28 18:02:56 compute-0 nova_compute[187223]:   <cpu mode="custom" match="exact">
Nov 28 18:02:56 compute-0 nova_compute[187223]:     <model>Nehalem</model>
Nov 28 18:02:56 compute-0 nova_compute[187223]:     <topology sockets="1" cores="1" threads="1"/>
Nov 28 18:02:56 compute-0 nova_compute[187223]:   </cpu>
Nov 28 18:02:56 compute-0 nova_compute[187223]:   <devices>
Nov 28 18:02:56 compute-0 nova_compute[187223]:     <disk type="file" device="disk">
Nov 28 18:02:56 compute-0 nova_compute[187223]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 28 18:02:56 compute-0 nova_compute[187223]:       <source file="/var/lib/nova/instances/b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1/disk"/>
Nov 28 18:02:56 compute-0 nova_compute[187223]:       <target dev="vda" bus="virtio"/>
Nov 28 18:02:56 compute-0 nova_compute[187223]:     </disk>
Nov 28 18:02:56 compute-0 nova_compute[187223]:     <disk type="file" device="cdrom">
Nov 28 18:02:56 compute-0 nova_compute[187223]:       <driver name="qemu" type="raw" cache="none"/>
Nov 28 18:02:56 compute-0 nova_compute[187223]:       <source file="/var/lib/nova/instances/b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1/disk.config"/>
Nov 28 18:02:56 compute-0 nova_compute[187223]:       <target dev="sda" bus="sata"/>
Nov 28 18:02:56 compute-0 nova_compute[187223]:     </disk>
Nov 28 18:02:56 compute-0 nova_compute[187223]:     <interface type="ethernet">
Nov 28 18:02:56 compute-0 nova_compute[187223]:       <mac address="fa:16:3e:55:a1:8c"/>
Nov 28 18:02:56 compute-0 nova_compute[187223]:       <model type="virtio"/>
Nov 28 18:02:56 compute-0 nova_compute[187223]:       <driver name="vhost" rx_queue_size="512"/>
Nov 28 18:02:56 compute-0 nova_compute[187223]:       <mtu size="1442"/>
Nov 28 18:02:56 compute-0 nova_compute[187223]:       <target dev="tape2e11670-b4"/>
Nov 28 18:02:56 compute-0 nova_compute[187223]:     </interface>
Nov 28 18:02:56 compute-0 nova_compute[187223]:     <serial type="pty">
Nov 28 18:02:56 compute-0 nova_compute[187223]:       <log file="/var/lib/nova/instances/b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1/console.log" append="off"/>
Nov 28 18:02:56 compute-0 nova_compute[187223]:     </serial>
Nov 28 18:02:56 compute-0 nova_compute[187223]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 28 18:02:56 compute-0 nova_compute[187223]:     <video>
Nov 28 18:02:56 compute-0 nova_compute[187223]:       <model type="virtio"/>
Nov 28 18:02:56 compute-0 nova_compute[187223]:     </video>
Nov 28 18:02:56 compute-0 nova_compute[187223]:     <input type="tablet" bus="usb"/>
Nov 28 18:02:56 compute-0 nova_compute[187223]:     <rng model="virtio">
Nov 28 18:02:56 compute-0 nova_compute[187223]:       <backend model="random">/dev/urandom</backend>
Nov 28 18:02:56 compute-0 nova_compute[187223]:     </rng>
Nov 28 18:02:56 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root"/>
Nov 28 18:02:56 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 18:02:56 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 18:02:56 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 18:02:56 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 18:02:56 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 18:02:56 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 18:02:56 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 18:02:56 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 18:02:56 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 18:02:56 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 18:02:56 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 18:02:56 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 18:02:56 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 18:02:56 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 18:02:56 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 18:02:56 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 18:02:56 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 18:02:56 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 18:02:56 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 18:02:56 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 18:02:56 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 18:02:56 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 18:02:56 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 18:02:56 compute-0 nova_compute[187223]:     <controller type="pci" model="pcie-root-port"/>
Nov 28 18:02:56 compute-0 nova_compute[187223]:     <controller type="usb" index="0"/>
Nov 28 18:02:56 compute-0 nova_compute[187223]:     <memballoon model="virtio">
Nov 28 18:02:56 compute-0 nova_compute[187223]:       <stats period="10"/>
Nov 28 18:02:56 compute-0 nova_compute[187223]:     </memballoon>
Nov 28 18:02:56 compute-0 nova_compute[187223]:   </devices>
Nov 28 18:02:56 compute-0 nova_compute[187223]: </domain>
Nov 28 18:02:56 compute-0 nova_compute[187223]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 28 18:02:56 compute-0 nova_compute[187223]: 2025-11-28 18:02:56.826 187227 DEBUG nova.compute.manager [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Preparing to wait for external event network-vif-plugged-e2e11670-b4f0-49f3-ab8b-779b9ac9caab prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 28 18:02:56 compute-0 nova_compute[187223]: 2025-11-28 18:02:56.826 187227 DEBUG oslo_concurrency.lockutils [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] Acquiring lock "b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 18:02:56 compute-0 nova_compute[187223]: 2025-11-28 18:02:56.826 187227 DEBUG oslo_concurrency.lockutils [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] Lock "b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 18:02:56 compute-0 nova_compute[187223]: 2025-11-28 18:02:56.827 187227 DEBUG oslo_concurrency.lockutils [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] Lock "b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 18:02:56 compute-0 nova_compute[187223]: 2025-11-28 18:02:56.827 187227 DEBUG nova.virt.libvirt.vif [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T18:02:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-1100562807',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-1100562807',id=29,image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='880f84e836504513b156c4ba7b7d1dc4',ramdisk_id='',reservation_id='r-eydq3mzo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-2088588059',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-2088588059-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T18:02:50Z,user_data=None,user_id='17c1adf6f47747cb879184a3da9c1d22',uuid=b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e2e11670-b4f0-49f3-ab8b-779b9ac9caab", "address": "fa:16:3e:55:a1:8c", "network": {"id": "73ece4ba-618d-48d3-8f69-d9fe38606d51", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1107034145-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "880f84e836504513b156c4ba7b7d1dc4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape2e11670-b4", "ovs_interfaceid": "e2e11670-b4f0-49f3-ab8b-779b9ac9caab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 28 18:02:56 compute-0 nova_compute[187223]: 2025-11-28 18:02:56.827 187227 DEBUG nova.network.os_vif_util [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] Converting VIF {"id": "e2e11670-b4f0-49f3-ab8b-779b9ac9caab", "address": "fa:16:3e:55:a1:8c", "network": {"id": "73ece4ba-618d-48d3-8f69-d9fe38606d51", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1107034145-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "880f84e836504513b156c4ba7b7d1dc4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape2e11670-b4", "ovs_interfaceid": "e2e11670-b4f0-49f3-ab8b-779b9ac9caab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 18:02:56 compute-0 nova_compute[187223]: 2025-11-28 18:02:56.828 187227 DEBUG nova.network.os_vif_util [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:55:a1:8c,bridge_name='br-int',has_traffic_filtering=True,id=e2e11670-b4f0-49f3-ab8b-779b9ac9caab,network=Network(73ece4ba-618d-48d3-8f69-d9fe38606d51),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape2e11670-b4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 18:02:56 compute-0 nova_compute[187223]: 2025-11-28 18:02:56.828 187227 DEBUG os_vif [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:55:a1:8c,bridge_name='br-int',has_traffic_filtering=True,id=e2e11670-b4f0-49f3-ab8b-779b9ac9caab,network=Network(73ece4ba-618d-48d3-8f69-d9fe38606d51),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape2e11670-b4') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 28 18:02:56 compute-0 nova_compute[187223]: 2025-11-28 18:02:56.829 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:02:56 compute-0 nova_compute[187223]: 2025-11-28 18:02:56.829 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 18:02:56 compute-0 nova_compute[187223]: 2025-11-28 18:02:56.829 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 18:02:56 compute-0 nova_compute[187223]: 2025-11-28 18:02:56.831 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:02:56 compute-0 nova_compute[187223]: 2025-11-28 18:02:56.831 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape2e11670-b4, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 18:02:56 compute-0 nova_compute[187223]: 2025-11-28 18:02:56.832 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape2e11670-b4, col_values=(('external_ids', {'iface-id': 'e2e11670-b4f0-49f3-ab8b-779b9ac9caab', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:55:a1:8c', 'vm-uuid': 'b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 18:02:56 compute-0 nova_compute[187223]: 2025-11-28 18:02:56.833 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:02:56 compute-0 NetworkManager[55763]: <info>  [1764352976.8347] manager: (tape2e11670-b4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/89)
Nov 28 18:02:56 compute-0 nova_compute[187223]: 2025-11-28 18:02:56.835 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 18:02:56 compute-0 nova_compute[187223]: 2025-11-28 18:02:56.839 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:02:56 compute-0 nova_compute[187223]: 2025-11-28 18:02:56.840 187227 INFO os_vif [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:55:a1:8c,bridge_name='br-int',has_traffic_filtering=True,id=e2e11670-b4f0-49f3-ab8b-779b9ac9caab,network=Network(73ece4ba-618d-48d3-8f69-d9fe38606d51),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape2e11670-b4')
Nov 28 18:02:56 compute-0 nova_compute[187223]: 2025-11-28 18:02:56.899 187227 DEBUG nova.virt.libvirt.driver [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 28 18:02:56 compute-0 nova_compute[187223]: 2025-11-28 18:02:56.900 187227 DEBUG nova.virt.libvirt.driver [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 28 18:02:56 compute-0 nova_compute[187223]: 2025-11-28 18:02:56.900 187227 DEBUG nova.virt.libvirt.driver [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] No VIF found with MAC fa:16:3e:55:a1:8c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 28 18:02:56 compute-0 nova_compute[187223]: 2025-11-28 18:02:56.900 187227 INFO nova.virt.libvirt.driver [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Using config drive
Nov 28 18:02:57 compute-0 nova_compute[187223]: 2025-11-28 18:02:57.319 187227 INFO nova.virt.libvirt.driver [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Creating config drive at /var/lib/nova/instances/b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1/disk.config
Nov 28 18:02:57 compute-0 nova_compute[187223]: 2025-11-28 18:02:57.330 187227 DEBUG oslo_concurrency.processutils [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkkae_dx_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 18:02:57 compute-0 nova_compute[187223]: 2025-11-28 18:02:57.465 187227 DEBUG oslo_concurrency.processutils [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkkae_dx_" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 18:02:57 compute-0 kernel: tape2e11670-b4: entered promiscuous mode
Nov 28 18:02:57 compute-0 NetworkManager[55763]: <info>  [1764352977.5340] manager: (tape2e11670-b4): new Tun device (/org/freedesktop/NetworkManager/Devices/90)
Nov 28 18:02:57 compute-0 ovn_controller[95574]: 2025-11-28T18:02:57Z|00220|binding|INFO|Claiming lport e2e11670-b4f0-49f3-ab8b-779b9ac9caab for this chassis.
Nov 28 18:02:57 compute-0 ovn_controller[95574]: 2025-11-28T18:02:57Z|00221|binding|INFO|e2e11670-b4f0-49f3-ab8b-779b9ac9caab: Claiming fa:16:3e:55:a1:8c 10.100.0.8
Nov 28 18:02:57 compute-0 nova_compute[187223]: 2025-11-28 18:02:57.534 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:02:57 compute-0 nova_compute[187223]: 2025-11-28 18:02:57.537 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:02:57 compute-0 nova_compute[187223]: 2025-11-28 18:02:57.542 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:02:57 compute-0 nova_compute[187223]: 2025-11-28 18:02:57.547 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:02:57 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:02:57.564 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:55:a1:8c 10.100.0.8'], port_security=['fa:16:3e:55:a1:8c 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-73ece4ba-618d-48d3-8f69-d9fe38606d51', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '880f84e836504513b156c4ba7b7d1dc4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '763f9899-cf0f-4d7a-aeb7-504f822cd750', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2e721e4f-1c56-4181-958c-fefbf24004df, chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], logical_port=e2e11670-b4f0-49f3-ab8b-779b9ac9caab) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 18:02:57 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:02:57.565 104433 INFO neutron.agent.ovn.metadata.agent [-] Port e2e11670-b4f0-49f3-ab8b-779b9ac9caab in datapath 73ece4ba-618d-48d3-8f69-d9fe38606d51 bound to our chassis
Nov 28 18:02:57 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:02:57.566 104433 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 73ece4ba-618d-48d3-8f69-d9fe38606d51
Nov 28 18:02:57 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:02:57.579 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[ca2a84e3-29fa-4b28-9507-46f63264286f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 18:02:57 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:02:57.580 104433 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap73ece4ba-61 in ovnmeta-73ece4ba-618d-48d3-8f69-d9fe38606d51 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 28 18:02:57 compute-0 systemd-udevd[220029]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 18:02:57 compute-0 systemd-machined[153517]: New machine qemu-21-instance-0000001d.
Nov 28 18:02:57 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:02:57.583 208826 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap73ece4ba-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 28 18:02:57 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:02:57.584 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[b343d91c-1b89-4250-9194-ac8959483e0f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 18:02:57 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:02:57.585 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[ffe250e4-ad15-4faa-8cbc-f9e5c553ad26]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 18:02:57 compute-0 NetworkManager[55763]: <info>  [1764352977.5936] device (tape2e11670-b4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 28 18:02:57 compute-0 NetworkManager[55763]: <info>  [1764352977.5943] device (tape2e11670-b4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 28 18:02:57 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:02:57.601 104546 DEBUG oslo.privsep.daemon [-] privsep: reply[aacd00f5-04d9-4073-b7c3-b1fee0a72244]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 18:02:57 compute-0 ovn_controller[95574]: 2025-11-28T18:02:57Z|00222|binding|INFO|Setting lport e2e11670-b4f0-49f3-ab8b-779b9ac9caab ovn-installed in OVS
Nov 28 18:02:57 compute-0 ovn_controller[95574]: 2025-11-28T18:02:57Z|00223|binding|INFO|Setting lport e2e11670-b4f0-49f3-ab8b-779b9ac9caab up in Southbound
Nov 28 18:02:57 compute-0 nova_compute[187223]: 2025-11-28 18:02:57.610 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:02:57 compute-0 systemd[1]: Started Virtual Machine qemu-21-instance-0000001d.
Nov 28 18:02:57 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:02:57.627 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[fe731994-50bd-4fb5-ac11-4151bbcf890d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 18:02:57 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:02:57.654 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[07ae4e13-e8a2-4f99-9920-8f58f53abc99]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 18:02:57 compute-0 NetworkManager[55763]: <info>  [1764352977.6608] manager: (tap73ece4ba-60): new Veth device (/org/freedesktop/NetworkManager/Devices/91)
Nov 28 18:02:57 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:02:57.661 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[b3902a4b-e4af-48b2-aa28-c76960947d9f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 18:02:57 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:02:57.697 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[42effaf7-e7b8-4ddf-ba33-9cd4bb8f78eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 18:02:57 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:02:57.702 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[b5309eb4-1257-496a-9258-229fc188d497]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 18:02:57 compute-0 NetworkManager[55763]: <info>  [1764352977.7260] device (tap73ece4ba-60): carrier: link connected
Nov 28 18:02:57 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:02:57.734 208840 DEBUG oslo.privsep.daemon [-] privsep: reply[df0e0fe4-ae00-4ef0-91d9-96f0f2ba02f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 18:02:57 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:02:57.754 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[6155b99a-cde1-4b67-80b5-ac79f98acf0a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap73ece4ba-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:45:1d:22'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 64], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 615235, 'reachable_time': 43408, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220061, 'error': None, 'target': 'ovnmeta-73ece4ba-618d-48d3-8f69-d9fe38606d51', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 18:02:57 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:02:57.767 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[702f1637-a699-429e-a3ef-62cd6cd0aa91]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe45:1d22'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 615235, 'tstamp': 615235}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220062, 'error': None, 'target': 'ovnmeta-73ece4ba-618d-48d3-8f69-d9fe38606d51', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 18:02:57 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:02:57.786 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[2d610709-e9bc-4b18-8cb6-eb687da5028e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap73ece4ba-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:45:1d:22'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 64], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 615235, 'reachable_time': 43408, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220063, 'error': None, 'target': 'ovnmeta-73ece4ba-618d-48d3-8f69-d9fe38606d51', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 18:02:57 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:02:57.814 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[d126c716-e35e-4dae-bc33-f8c5b8afd793]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 18:02:57 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:02:57.865 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[1eace7d6-1819-44ed-b387-b939f319a8b7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 18:02:57 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:02:57.866 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap73ece4ba-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 18:02:57 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:02:57.867 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 18:02:57 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:02:57.867 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap73ece4ba-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 18:02:57 compute-0 NetworkManager[55763]: <info>  [1764352977.9101] manager: (tap73ece4ba-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/92)
Nov 28 18:02:57 compute-0 kernel: tap73ece4ba-60: entered promiscuous mode
Nov 28 18:02:57 compute-0 nova_compute[187223]: 2025-11-28 18:02:57.912 187227 DEBUG nova.compute.manager [req-c858e77e-fc00-401b-ab82-6ac44c228ee9 req-c1797c51-9b07-4b2d-ae40-de46459d813d 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Received event network-vif-plugged-e2e11670-b4f0-49f3-ab8b-779b9ac9caab external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 18:02:57 compute-0 nova_compute[187223]: 2025-11-28 18:02:57.912 187227 DEBUG oslo_concurrency.lockutils [req-c858e77e-fc00-401b-ab82-6ac44c228ee9 req-c1797c51-9b07-4b2d-ae40-de46459d813d 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 18:02:57 compute-0 nova_compute[187223]: 2025-11-28 18:02:57.912 187227 DEBUG oslo_concurrency.lockutils [req-c858e77e-fc00-401b-ab82-6ac44c228ee9 req-c1797c51-9b07-4b2d-ae40-de46459d813d 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 18:02:57 compute-0 nova_compute[187223]: 2025-11-28 18:02:57.913 187227 DEBUG oslo_concurrency.lockutils [req-c858e77e-fc00-401b-ab82-6ac44c228ee9 req-c1797c51-9b07-4b2d-ae40-de46459d813d 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 18:02:57 compute-0 nova_compute[187223]: 2025-11-28 18:02:57.913 187227 DEBUG nova.compute.manager [req-c858e77e-fc00-401b-ab82-6ac44c228ee9 req-c1797c51-9b07-4b2d-ae40-de46459d813d 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Processing event network-vif-plugged-e2e11670-b4f0-49f3-ab8b-779b9ac9caab _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 28 18:02:57 compute-0 nova_compute[187223]: 2025-11-28 18:02:57.913 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:02:57 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:02:57.915 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap73ece4ba-60, col_values=(('external_ids', {'iface-id': 'e195ad68-ae72-4db1-8d09-4e10a1ca1315'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 18:02:57 compute-0 nova_compute[187223]: 2025-11-28 18:02:57.916 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:02:57 compute-0 ovn_controller[95574]: 2025-11-28T18:02:57Z|00224|binding|INFO|Releasing lport e195ad68-ae72-4db1-8d09-4e10a1ca1315 from this chassis (sb_readonly=0)
Nov 28 18:02:57 compute-0 nova_compute[187223]: 2025-11-28 18:02:57.917 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:02:57 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:02:57.918 104433 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/73ece4ba-618d-48d3-8f69-d9fe38606d51.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/73ece4ba-618d-48d3-8f69-d9fe38606d51.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 28 18:02:57 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:02:57.918 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[f8b1b42e-ee20-446f-8df6-36ff39f87083]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 18:02:57 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:02:57.919 104433 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 28 18:02:57 compute-0 ovn_metadata_agent[104428]: global
Nov 28 18:02:57 compute-0 ovn_metadata_agent[104428]:     log         /dev/log local0 debug
Nov 28 18:02:57 compute-0 ovn_metadata_agent[104428]:     log-tag     haproxy-metadata-proxy-73ece4ba-618d-48d3-8f69-d9fe38606d51
Nov 28 18:02:57 compute-0 ovn_metadata_agent[104428]:     user        root
Nov 28 18:02:57 compute-0 ovn_metadata_agent[104428]:     group       root
Nov 28 18:02:57 compute-0 ovn_metadata_agent[104428]:     maxconn     1024
Nov 28 18:02:57 compute-0 ovn_metadata_agent[104428]:     pidfile     /var/lib/neutron/external/pids/73ece4ba-618d-48d3-8f69-d9fe38606d51.pid.haproxy
Nov 28 18:02:57 compute-0 ovn_metadata_agent[104428]:     daemon
Nov 28 18:02:57 compute-0 ovn_metadata_agent[104428]: 
Nov 28 18:02:57 compute-0 ovn_metadata_agent[104428]: defaults
Nov 28 18:02:57 compute-0 ovn_metadata_agent[104428]:     log global
Nov 28 18:02:57 compute-0 ovn_metadata_agent[104428]:     mode http
Nov 28 18:02:57 compute-0 ovn_metadata_agent[104428]:     option httplog
Nov 28 18:02:57 compute-0 ovn_metadata_agent[104428]:     option dontlognull
Nov 28 18:02:57 compute-0 ovn_metadata_agent[104428]:     option http-server-close
Nov 28 18:02:57 compute-0 ovn_metadata_agent[104428]:     option forwardfor
Nov 28 18:02:57 compute-0 ovn_metadata_agent[104428]:     retries                 3
Nov 28 18:02:57 compute-0 ovn_metadata_agent[104428]:     timeout http-request    30s
Nov 28 18:02:57 compute-0 ovn_metadata_agent[104428]:     timeout connect         30s
Nov 28 18:02:57 compute-0 ovn_metadata_agent[104428]:     timeout client          32s
Nov 28 18:02:57 compute-0 ovn_metadata_agent[104428]:     timeout server          32s
Nov 28 18:02:57 compute-0 ovn_metadata_agent[104428]:     timeout http-keep-alive 30s
Nov 28 18:02:57 compute-0 ovn_metadata_agent[104428]: 
Nov 28 18:02:57 compute-0 ovn_metadata_agent[104428]: 
Nov 28 18:02:57 compute-0 ovn_metadata_agent[104428]: listen listener
Nov 28 18:02:57 compute-0 ovn_metadata_agent[104428]:     bind 169.254.169.254:80
Nov 28 18:02:57 compute-0 ovn_metadata_agent[104428]:     server metadata /var/lib/neutron/metadata_proxy
Nov 28 18:02:57 compute-0 ovn_metadata_agent[104428]:     http-request add-header X-OVN-Network-ID 73ece4ba-618d-48d3-8f69-d9fe38606d51
Nov 28 18:02:57 compute-0 ovn_metadata_agent[104428]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 28 18:02:57 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:02:57.920 104433 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-73ece4ba-618d-48d3-8f69-d9fe38606d51', 'env', 'PROCESS_TAG=haproxy-73ece4ba-618d-48d3-8f69-d9fe38606d51', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/73ece4ba-618d-48d3-8f69-d9fe38606d51.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 28 18:02:57 compute-0 nova_compute[187223]: 2025-11-28 18:02:57.930 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:02:58 compute-0 nova_compute[187223]: 2025-11-28 18:02:58.187 187227 DEBUG nova.virt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Emitting event <LifecycleEvent: 1764352978.1867971, b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 18:02:58 compute-0 nova_compute[187223]: 2025-11-28 18:02:58.187 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] VM Started (Lifecycle Event)
Nov 28 18:02:58 compute-0 nova_compute[187223]: 2025-11-28 18:02:58.189 187227 DEBUG nova.compute.manager [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 28 18:02:58 compute-0 nova_compute[187223]: 2025-11-28 18:02:58.193 187227 DEBUG nova.virt.libvirt.driver [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 28 18:02:58 compute-0 nova_compute[187223]: 2025-11-28 18:02:58.196 187227 DEBUG nova.network.neutron [req-97b5e366-a52b-4a25-b189-cfacc51d4fc2 req-0c99cb21-ecb5-42ac-b231-552e5ffe5bc3 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Updated VIF entry in instance network info cache for port e2e11670-b4f0-49f3-ab8b-779b9ac9caab. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 28 18:02:58 compute-0 nova_compute[187223]: 2025-11-28 18:02:58.196 187227 DEBUG nova.network.neutron [req-97b5e366-a52b-4a25-b189-cfacc51d4fc2 req-0c99cb21-ecb5-42ac-b231-552e5ffe5bc3 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Updating instance_info_cache with network_info: [{"id": "e2e11670-b4f0-49f3-ab8b-779b9ac9caab", "address": "fa:16:3e:55:a1:8c", "network": {"id": "73ece4ba-618d-48d3-8f69-d9fe38606d51", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1107034145-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "880f84e836504513b156c4ba7b7d1dc4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape2e11670-b4", "ovs_interfaceid": "e2e11670-b4f0-49f3-ab8b-779b9ac9caab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 18:02:58 compute-0 nova_compute[187223]: 2025-11-28 18:02:58.197 187227 INFO nova.virt.libvirt.driver [-] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Instance spawned successfully.
Nov 28 18:02:58 compute-0 nova_compute[187223]: 2025-11-28 18:02:58.198 187227 DEBUG nova.virt.libvirt.driver [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 28 18:02:58 compute-0 nova_compute[187223]: 2025-11-28 18:02:58.216 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 18:02:58 compute-0 nova_compute[187223]: 2025-11-28 18:02:58.219 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 28 18:02:58 compute-0 nova_compute[187223]: 2025-11-28 18:02:58.223 187227 DEBUG oslo_concurrency.lockutils [req-97b5e366-a52b-4a25-b189-cfacc51d4fc2 req-0c99cb21-ecb5-42ac-b231-552e5ffe5bc3 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Releasing lock "refresh_cache-b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 18:02:58 compute-0 nova_compute[187223]: 2025-11-28 18:02:58.227 187227 DEBUG nova.virt.libvirt.driver [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 18:02:58 compute-0 nova_compute[187223]: 2025-11-28 18:02:58.227 187227 DEBUG nova.virt.libvirt.driver [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 18:02:58 compute-0 nova_compute[187223]: 2025-11-28 18:02:58.228 187227 DEBUG nova.virt.libvirt.driver [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 18:02:58 compute-0 nova_compute[187223]: 2025-11-28 18:02:58.228 187227 DEBUG nova.virt.libvirt.driver [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 18:02:58 compute-0 nova_compute[187223]: 2025-11-28 18:02:58.228 187227 DEBUG nova.virt.libvirt.driver [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 18:02:58 compute-0 nova_compute[187223]: 2025-11-28 18:02:58.229 187227 DEBUG nova.virt.libvirt.driver [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 18:02:58 compute-0 nova_compute[187223]: 2025-11-28 18:02:58.250 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 28 18:02:58 compute-0 nova_compute[187223]: 2025-11-28 18:02:58.250 187227 DEBUG nova.virt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Emitting event <LifecycleEvent: 1764352978.1870313, b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 18:02:58 compute-0 nova_compute[187223]: 2025-11-28 18:02:58.251 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] VM Paused (Lifecycle Event)
Nov 28 18:02:58 compute-0 nova_compute[187223]: 2025-11-28 18:02:58.283 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 18:02:58 compute-0 nova_compute[187223]: 2025-11-28 18:02:58.286 187227 DEBUG nova.virt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Emitting event <LifecycleEvent: 1764352978.1920295, b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 18:02:58 compute-0 nova_compute[187223]: 2025-11-28 18:02:58.286 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] VM Resumed (Lifecycle Event)
Nov 28 18:02:58 compute-0 podman[220102]: 2025-11-28 18:02:58.29727994 +0000 UTC m=+0.049972001 container create a5d8cdb058317e4700c53d8cd4d5a9dd121e57c51209e459e8e52479509e11f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-73ece4ba-618d-48d3-8f69-d9fe38606d51, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 18:02:58 compute-0 nova_compute[187223]: 2025-11-28 18:02:58.301 187227 INFO nova.compute.manager [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Took 7.56 seconds to spawn the instance on the hypervisor.
Nov 28 18:02:58 compute-0 nova_compute[187223]: 2025-11-28 18:02:58.302 187227 DEBUG nova.compute.manager [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 18:02:58 compute-0 nova_compute[187223]: 2025-11-28 18:02:58.319 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 18:02:58 compute-0 nova_compute[187223]: 2025-11-28 18:02:58.321 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 28 18:02:58 compute-0 systemd[1]: Started libpod-conmon-a5d8cdb058317e4700c53d8cd4d5a9dd121e57c51209e459e8e52479509e11f2.scope.
Nov 28 18:02:58 compute-0 nova_compute[187223]: 2025-11-28 18:02:58.350 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 28 18:02:58 compute-0 systemd[1]: Started libcrun container.
Nov 28 18:02:58 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4e2ea5f779acc87dc5e9bb3786073668cf9872d7ad520e8469b62d34fdb9ee0e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 18:02:58 compute-0 podman[220102]: 2025-11-28 18:02:58.268794425 +0000 UTC m=+0.021486576 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 18:02:58 compute-0 podman[220102]: 2025-11-28 18:02:58.37787718 +0000 UTC m=+0.130569261 container init a5d8cdb058317e4700c53d8cd4d5a9dd121e57c51209e459e8e52479509e11f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-73ece4ba-618d-48d3-8f69-d9fe38606d51, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 18:02:58 compute-0 nova_compute[187223]: 2025-11-28 18:02:58.378 187227 INFO nova.compute.manager [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Took 8.20 seconds to build instance.
Nov 28 18:02:58 compute-0 podman[220102]: 2025-11-28 18:02:58.383996594 +0000 UTC m=+0.136688655 container start a5d8cdb058317e4700c53d8cd4d5a9dd121e57c51209e459e8e52479509e11f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-73ece4ba-618d-48d3-8f69-d9fe38606d51, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 28 18:02:58 compute-0 nova_compute[187223]: 2025-11-28 18:02:58.395 187227 DEBUG oslo_concurrency.lockutils [None req-6c1b8e72-cdd8-4e5f-ba5c-42d8885e7e05 17c1adf6f47747cb879184a3da9c1d22 880f84e836504513b156c4ba7b7d1dc4 - - default default] Lock "b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.286s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 18:02:58 compute-0 neutron-haproxy-ovnmeta-73ece4ba-618d-48d3-8f69-d9fe38606d51[220117]: [NOTICE]   (220121) : New worker (220123) forked
Nov 28 18:02:58 compute-0 neutron-haproxy-ovnmeta-73ece4ba-618d-48d3-8f69-d9fe38606d51[220117]: [NOTICE]   (220121) : Loading success.
Nov 28 18:02:59 compute-0 podman[197556]: time="2025-11-28T18:02:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 18:02:59 compute-0 podman[197556]: @ - - [28/Nov/2025:18:02:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18323 "" "Go-http-client/1.1"
Nov 28 18:02:59 compute-0 podman[197556]: @ - - [28/Nov/2025:18:02:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3061 "" "Go-http-client/1.1"
Nov 28 18:03:00 compute-0 nova_compute[187223]: 2025-11-28 18:03:00.021 187227 DEBUG nova.compute.manager [req-125b5374-a806-4b6b-ad6e-c058eecfb2d5 req-88b4d474-e6e4-461b-b563-7d0d8d0e4782 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Received event network-vif-plugged-e2e11670-b4f0-49f3-ab8b-779b9ac9caab external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 18:03:00 compute-0 nova_compute[187223]: 2025-11-28 18:03:00.022 187227 DEBUG oslo_concurrency.lockutils [req-125b5374-a806-4b6b-ad6e-c058eecfb2d5 req-88b4d474-e6e4-461b-b563-7d0d8d0e4782 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 18:03:00 compute-0 nova_compute[187223]: 2025-11-28 18:03:00.023 187227 DEBUG oslo_concurrency.lockutils [req-125b5374-a806-4b6b-ad6e-c058eecfb2d5 req-88b4d474-e6e4-461b-b563-7d0d8d0e4782 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 18:03:00 compute-0 nova_compute[187223]: 2025-11-28 18:03:00.023 187227 DEBUG oslo_concurrency.lockutils [req-125b5374-a806-4b6b-ad6e-c058eecfb2d5 req-88b4d474-e6e4-461b-b563-7d0d8d0e4782 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 18:03:00 compute-0 nova_compute[187223]: 2025-11-28 18:03:00.024 187227 DEBUG nova.compute.manager [req-125b5374-a806-4b6b-ad6e-c058eecfb2d5 req-88b4d474-e6e4-461b-b563-7d0d8d0e4782 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] No waiting events found dispatching network-vif-plugged-e2e11670-b4f0-49f3-ab8b-779b9ac9caab pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 18:03:00 compute-0 nova_compute[187223]: 2025-11-28 18:03:00.024 187227 WARNING nova.compute.manager [req-125b5374-a806-4b6b-ad6e-c058eecfb2d5 req-88b4d474-e6e4-461b-b563-7d0d8d0e4782 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Received unexpected event network-vif-plugged-e2e11670-b4f0-49f3-ab8b-779b9ac9caab for instance with vm_state active and task_state None.
Nov 28 18:03:00 compute-0 nova_compute[187223]: 2025-11-28 18:03:00.671 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:03:00 compute-0 nova_compute[187223]: 2025-11-28 18:03:00.704 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 18:03:00 compute-0 nova_compute[187223]: 2025-11-28 18:03:00.705 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 18:03:00 compute-0 nova_compute[187223]: 2025-11-28 18:03:00.705 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 18:03:01 compute-0 openstack_network_exporter[199717]: ERROR   18:03:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 18:03:01 compute-0 openstack_network_exporter[199717]: ERROR   18:03:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 18:03:01 compute-0 openstack_network_exporter[199717]: ERROR   18:03:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 18:03:01 compute-0 openstack_network_exporter[199717]: ERROR   18:03:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 18:03:01 compute-0 openstack_network_exporter[199717]: ERROR   18:03:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 18:03:01 compute-0 nova_compute[187223]: 2025-11-28 18:03:01.552 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "refresh_cache-b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 18:03:01 compute-0 nova_compute[187223]: 2025-11-28 18:03:01.553 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquired lock "refresh_cache-b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 18:03:01 compute-0 nova_compute[187223]: 2025-11-28 18:03:01.553 187227 DEBUG nova.network.neutron [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 28 18:03:01 compute-0 nova_compute[187223]: 2025-11-28 18:03:01.554 187227 DEBUG nova.objects.instance [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 18:03:01 compute-0 nova_compute[187223]: 2025-11-28 18:03:01.835 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:03:03 compute-0 nova_compute[187223]: 2025-11-28 18:03:03.902 187227 DEBUG nova.network.neutron [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Updating instance_info_cache with network_info: [{"id": "e2e11670-b4f0-49f3-ab8b-779b9ac9caab", "address": "fa:16:3e:55:a1:8c", "network": {"id": "73ece4ba-618d-48d3-8f69-d9fe38606d51", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1107034145-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "880f84e836504513b156c4ba7b7d1dc4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape2e11670-b4", "ovs_interfaceid": "e2e11670-b4f0-49f3-ab8b-779b9ac9caab", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 18:03:03 compute-0 nova_compute[187223]: 2025-11-28 18:03:03.929 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Releasing lock "refresh_cache-b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 18:03:03 compute-0 nova_compute[187223]: 2025-11-28 18:03:03.930 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 28 18:03:03 compute-0 nova_compute[187223]: 2025-11-28 18:03:03.931 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 18:03:03 compute-0 nova_compute[187223]: 2025-11-28 18:03:03.931 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 18:03:03 compute-0 nova_compute[187223]: 2025-11-28 18:03:03.932 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 18:03:03 compute-0 nova_compute[187223]: 2025-11-28 18:03:03.961 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 18:03:03 compute-0 nova_compute[187223]: 2025-11-28 18:03:03.962 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 18:03:03 compute-0 nova_compute[187223]: 2025-11-28 18:03:03.962 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 18:03:03 compute-0 nova_compute[187223]: 2025-11-28 18:03:03.963 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 18:03:04 compute-0 nova_compute[187223]: 2025-11-28 18:03:04.042 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 18:03:04 compute-0 nova_compute[187223]: 2025-11-28 18:03:04.107 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 18:03:04 compute-0 nova_compute[187223]: 2025-11-28 18:03:04.108 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 18:03:04 compute-0 nova_compute[187223]: 2025-11-28 18:03:04.164 187227 DEBUG oslo_concurrency.processutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 18:03:04 compute-0 nova_compute[187223]: 2025-11-28 18:03:04.310 187227 WARNING nova.virt.libvirt.driver [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 18:03:04 compute-0 nova_compute[187223]: 2025-11-28 18:03:04.312 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5705MB free_disk=73.34035873413086GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 18:03:04 compute-0 nova_compute[187223]: 2025-11-28 18:03:04.312 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 18:03:04 compute-0 nova_compute[187223]: 2025-11-28 18:03:04.312 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 18:03:04 compute-0 nova_compute[187223]: 2025-11-28 18:03:04.440 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Instance b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 18:03:04 compute-0 nova_compute[187223]: 2025-11-28 18:03:04.441 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 18:03:04 compute-0 nova_compute[187223]: 2025-11-28 18:03:04.442 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 18:03:04 compute-0 nova_compute[187223]: 2025-11-28 18:03:04.514 187227 DEBUG nova.compute.provider_tree [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 18:03:04 compute-0 nova_compute[187223]: 2025-11-28 18:03:04.537 187227 DEBUG nova.scheduler.client.report [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 18:03:04 compute-0 nova_compute[187223]: 2025-11-28 18:03:04.554 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 18:03:04 compute-0 nova_compute[187223]: 2025-11-28 18:03:04.555 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.242s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 18:03:05 compute-0 podman[220139]: 2025-11-28 18:03:05.196161346 +0000 UTC m=+0.059538939 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 18:03:05 compute-0 nova_compute[187223]: 2025-11-28 18:03:05.672 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:03:06 compute-0 nova_compute[187223]: 2025-11-28 18:03:06.838 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:03:07 compute-0 nova_compute[187223]: 2025-11-28 18:03:07.529 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 18:03:09 compute-0 ovn_controller[95574]: 2025-11-28T18:03:09Z|00031|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:55:a1:8c 10.100.0.8
Nov 28 18:03:09 compute-0 ovn_controller[95574]: 2025-11-28T18:03:09Z|00032|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:55:a1:8c 10.100.0.8
Nov 28 18:03:10 compute-0 nova_compute[187223]: 2025-11-28 18:03:10.456 187227 DEBUG nova.virt.libvirt.driver [None req-f3692ca3-afe9-492d-8027-0d69949d9121 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Check if temp file /var/lib/nova/instances/tmp1dyz3l4s exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Nov 28 18:03:10 compute-0 nova_compute[187223]: 2025-11-28 18:03:10.457 187227 DEBUG nova.compute.manager [None req-f3692ca3-afe9-492d-8027-0d69949d9121 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp1dyz3l4s',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Nov 28 18:03:10 compute-0 nova_compute[187223]: 2025-11-28 18:03:10.675 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:03:11 compute-0 nova_compute[187223]: 2025-11-28 18:03:11.841 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:03:11 compute-0 nova_compute[187223]: 2025-11-28 18:03:11.944 187227 DEBUG oslo_concurrency.processutils [None req-f3692ca3-afe9-492d-8027-0d69949d9121 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 18:03:12 compute-0 nova_compute[187223]: 2025-11-28 18:03:12.010 187227 DEBUG oslo_concurrency.processutils [None req-f3692ca3-afe9-492d-8027-0d69949d9121 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 18:03:12 compute-0 nova_compute[187223]: 2025-11-28 18:03:12.011 187227 DEBUG oslo_concurrency.processutils [None req-f3692ca3-afe9-492d-8027-0d69949d9121 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 18:03:12 compute-0 nova_compute[187223]: 2025-11-28 18:03:12.068 187227 DEBUG oslo_concurrency.processutils [None req-f3692ca3-afe9-492d-8027-0d69949d9121 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 18:03:14 compute-0 podman[220189]: 2025-11-28 18:03:14.228530613 +0000 UTC m=+0.076718335 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 18:03:14 compute-0 sshd-session[220190]: Accepted publickey for nova from 192.168.122.101 port 58100 ssh2: ECDSA SHA256:MQeyK0000lOAPc/y+l43YtWHgJLzao0oTzY3RbV6Jns
Nov 28 18:03:14 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Nov 28 18:03:14 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 28 18:03:14 compute-0 systemd-logind[788]: New session 45 of user nova.
Nov 28 18:03:14 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 28 18:03:14 compute-0 systemd[1]: Starting User Manager for UID 42436...
Nov 28 18:03:14 compute-0 systemd[220214]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 28 18:03:14 compute-0 systemd[220214]: Queued start job for default target Main User Target.
Nov 28 18:03:14 compute-0 systemd[220214]: Created slice User Application Slice.
Nov 28 18:03:14 compute-0 systemd[220214]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 28 18:03:14 compute-0 systemd[220214]: Started Daily Cleanup of User's Temporary Directories.
Nov 28 18:03:14 compute-0 systemd[220214]: Reached target Paths.
Nov 28 18:03:14 compute-0 systemd[220214]: Reached target Timers.
Nov 28 18:03:14 compute-0 systemd[220214]: Starting D-Bus User Message Bus Socket...
Nov 28 18:03:14 compute-0 systemd[220214]: Starting Create User's Volatile Files and Directories...
Nov 28 18:03:14 compute-0 systemd[220214]: Listening on D-Bus User Message Bus Socket.
Nov 28 18:03:14 compute-0 systemd[220214]: Finished Create User's Volatile Files and Directories.
Nov 28 18:03:14 compute-0 systemd[220214]: Reached target Sockets.
Nov 28 18:03:14 compute-0 systemd[220214]: Reached target Basic System.
Nov 28 18:03:14 compute-0 systemd[220214]: Reached target Main User Target.
Nov 28 18:03:14 compute-0 systemd[220214]: Startup finished in 126ms.
Nov 28 18:03:14 compute-0 systemd[1]: Started User Manager for UID 42436.
Nov 28 18:03:14 compute-0 systemd[1]: Started Session 45 of User nova.
Nov 28 18:03:14 compute-0 sshd-session[220190]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Nov 28 18:03:14 compute-0 sshd-session[220229]: Received disconnect from 192.168.122.101 port 58100:11: disconnected by user
Nov 28 18:03:14 compute-0 sshd-session[220229]: Disconnected from user nova 192.168.122.101 port 58100
Nov 28 18:03:14 compute-0 sshd-session[220190]: pam_unix(sshd:session): session closed for user nova
Nov 28 18:03:14 compute-0 systemd[1]: session-45.scope: Deactivated successfully.
Nov 28 18:03:14 compute-0 systemd-logind[788]: Session 45 logged out. Waiting for processes to exit.
Nov 28 18:03:14 compute-0 systemd-logind[788]: Removed session 45.
Nov 28 18:03:14 compute-0 nova_compute[187223]: 2025-11-28 18:03:14.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 18:03:14 compute-0 nova_compute[187223]: 2025-11-28 18:03:14.686 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 28 18:03:15 compute-0 nova_compute[187223]: 2025-11-28 18:03:15.677 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:03:16 compute-0 nova_compute[187223]: 2025-11-28 18:03:16.023 187227 DEBUG nova.compute.manager [req-820d3e1e-a4e6-4288-ad2f-980cdc99c73c req-e026fc8e-0fe5-40a0-baa1-0424ea3eb666 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Received event network-vif-unplugged-e2e11670-b4f0-49f3-ab8b-779b9ac9caab external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 18:03:16 compute-0 nova_compute[187223]: 2025-11-28 18:03:16.024 187227 DEBUG oslo_concurrency.lockutils [req-820d3e1e-a4e6-4288-ad2f-980cdc99c73c req-e026fc8e-0fe5-40a0-baa1-0424ea3eb666 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 18:03:16 compute-0 nova_compute[187223]: 2025-11-28 18:03:16.025 187227 DEBUG oslo_concurrency.lockutils [req-820d3e1e-a4e6-4288-ad2f-980cdc99c73c req-e026fc8e-0fe5-40a0-baa1-0424ea3eb666 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 18:03:16 compute-0 nova_compute[187223]: 2025-11-28 18:03:16.025 187227 DEBUG oslo_concurrency.lockutils [req-820d3e1e-a4e6-4288-ad2f-980cdc99c73c req-e026fc8e-0fe5-40a0-baa1-0424ea3eb666 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 18:03:16 compute-0 nova_compute[187223]: 2025-11-28 18:03:16.026 187227 DEBUG nova.compute.manager [req-820d3e1e-a4e6-4288-ad2f-980cdc99c73c req-e026fc8e-0fe5-40a0-baa1-0424ea3eb666 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] No waiting events found dispatching network-vif-unplugged-e2e11670-b4f0-49f3-ab8b-779b9ac9caab pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 18:03:16 compute-0 nova_compute[187223]: 2025-11-28 18:03:16.026 187227 DEBUG nova.compute.manager [req-820d3e1e-a4e6-4288-ad2f-980cdc99c73c req-e026fc8e-0fe5-40a0-baa1-0424ea3eb666 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Received event network-vif-unplugged-e2e11670-b4f0-49f3-ab8b-779b9ac9caab for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 28 18:03:16 compute-0 nova_compute[187223]: 2025-11-28 18:03:16.195 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:03:16 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:03:16.194 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=32, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3e:3c:af', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '0e:e0:60:f9:5f:25'}, ipsec=False) old=SB_Global(nb_cfg=31) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 18:03:16 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:03:16.197 104433 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 18:03:16 compute-0 nova_compute[187223]: 2025-11-28 18:03:16.506 187227 INFO nova.compute.manager [None req-f3692ca3-afe9-492d-8027-0d69949d9121 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Took 4.44 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Nov 28 18:03:16 compute-0 nova_compute[187223]: 2025-11-28 18:03:16.507 187227 DEBUG nova.compute.manager [None req-f3692ca3-afe9-492d-8027-0d69949d9121 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 28 18:03:16 compute-0 nova_compute[187223]: 2025-11-28 18:03:16.529 187227 DEBUG nova.compute.manager [None req-f3692ca3-afe9-492d-8027-0d69949d9121 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp1dyz3l4s',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(e5d78c21-f6d8-469e-98d5-5f9ecb56a48e),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Nov 28 18:03:16 compute-0 nova_compute[187223]: 2025-11-28 18:03:16.557 187227 DEBUG nova.objects.instance [None req-f3692ca3-afe9-492d-8027-0d69949d9121 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lazy-loading 'migration_context' on Instance uuid b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 18:03:16 compute-0 nova_compute[187223]: 2025-11-28 18:03:16.558 187227 DEBUG nova.virt.libvirt.driver [None req-f3692ca3-afe9-492d-8027-0d69949d9121 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Nov 28 18:03:16 compute-0 nova_compute[187223]: 2025-11-28 18:03:16.560 187227 DEBUG nova.virt.libvirt.driver [None req-f3692ca3-afe9-492d-8027-0d69949d9121 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Nov 28 18:03:16 compute-0 nova_compute[187223]: 2025-11-28 18:03:16.560 187227 DEBUG nova.virt.libvirt.driver [None req-f3692ca3-afe9-492d-8027-0d69949d9121 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Nov 28 18:03:16 compute-0 nova_compute[187223]: 2025-11-28 18:03:16.577 187227 DEBUG nova.virt.libvirt.vif [None req-f3692ca3-afe9-492d-8027-0d69949d9121 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T18:02:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-1100562807',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-1100562807',id=29,image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-28T18:02:58Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='880f84e836504513b156c4ba7b7d1dc4',ramdisk_id='',reservation_id='r-eydq3mzo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-2088588059',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-2088588059-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-28T18:02:58Z,user_data=None,user_id='17c1adf6f47747cb879184a3da9c1d22',uuid=b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e2e11670-b4f0-49f3-ab8b-779b9ac9caab", "address": "fa:16:3e:55:a1:8c", "network": {"id": "73ece4ba-618d-48d3-8f69-d9fe38606d51", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1107034145-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "880f84e836504513b156c4ba7b7d1dc4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tape2e11670-b4", "ovs_interfaceid": "e2e11670-b4f0-49f3-ab8b-779b9ac9caab", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 28 18:03:16 compute-0 nova_compute[187223]: 2025-11-28 18:03:16.577 187227 DEBUG nova.network.os_vif_util [None req-f3692ca3-afe9-492d-8027-0d69949d9121 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Converting VIF {"id": "e2e11670-b4f0-49f3-ab8b-779b9ac9caab", "address": "fa:16:3e:55:a1:8c", "network": {"id": "73ece4ba-618d-48d3-8f69-d9fe38606d51", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1107034145-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "880f84e836504513b156c4ba7b7d1dc4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tape2e11670-b4", "ovs_interfaceid": "e2e11670-b4f0-49f3-ab8b-779b9ac9caab", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 18:03:16 compute-0 nova_compute[187223]: 2025-11-28 18:03:16.578 187227 DEBUG nova.network.os_vif_util [None req-f3692ca3-afe9-492d-8027-0d69949d9121 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:55:a1:8c,bridge_name='br-int',has_traffic_filtering=True,id=e2e11670-b4f0-49f3-ab8b-779b9ac9caab,network=Network(73ece4ba-618d-48d3-8f69-d9fe38606d51),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape2e11670-b4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 18:03:16 compute-0 nova_compute[187223]: 2025-11-28 18:03:16.578 187227 DEBUG nova.virt.libvirt.migration [None req-f3692ca3-afe9-492d-8027-0d69949d9121 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Updating guest XML with vif config: <interface type="ethernet">
Nov 28 18:03:16 compute-0 nova_compute[187223]:   <mac address="fa:16:3e:55:a1:8c"/>
Nov 28 18:03:16 compute-0 nova_compute[187223]:   <model type="virtio"/>
Nov 28 18:03:16 compute-0 nova_compute[187223]:   <driver name="vhost" rx_queue_size="512"/>
Nov 28 18:03:16 compute-0 nova_compute[187223]:   <mtu size="1442"/>
Nov 28 18:03:16 compute-0 nova_compute[187223]:   <target dev="tape2e11670-b4"/>
Nov 28 18:03:16 compute-0 nova_compute[187223]: </interface>
Nov 28 18:03:16 compute-0 nova_compute[187223]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Nov 28 18:03:16 compute-0 nova_compute[187223]: 2025-11-28 18:03:16.579 187227 DEBUG nova.virt.libvirt.driver [None req-f3692ca3-afe9-492d-8027-0d69949d9121 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Nov 28 18:03:16 compute-0 nova_compute[187223]: 2025-11-28 18:03:16.844 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:03:17 compute-0 nova_compute[187223]: 2025-11-28 18:03:17.063 187227 DEBUG nova.virt.libvirt.migration [None req-f3692ca3-afe9-492d-8027-0d69949d9121 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 28 18:03:17 compute-0 nova_compute[187223]: 2025-11-28 18:03:17.063 187227 INFO nova.virt.libvirt.migration [None req-f3692ca3-afe9-492d-8027-0d69949d9121 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Increasing downtime to 50 ms after 0 sec elapsed time
Nov 28 18:03:17 compute-0 nova_compute[187223]: 2025-11-28 18:03:17.147 187227 INFO nova.virt.libvirt.driver [None req-f3692ca3-afe9-492d-8027-0d69949d9121 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Nov 28 18:03:17 compute-0 nova_compute[187223]: 2025-11-28 18:03:17.650 187227 DEBUG nova.virt.libvirt.migration [None req-f3692ca3-afe9-492d-8027-0d69949d9121 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 28 18:03:17 compute-0 nova_compute[187223]: 2025-11-28 18:03:17.650 187227 DEBUG nova.virt.libvirt.migration [None req-f3692ca3-afe9-492d-8027-0d69949d9121 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 28 18:03:18 compute-0 nova_compute[187223]: 2025-11-28 18:03:18.127 187227 DEBUG nova.compute.manager [req-b3fc7996-3ba2-489e-bbfe-c9947a64b1c8 req-5cd29641-a351-40ba-a83b-9eb7ff33076f 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Received event network-vif-plugged-e2e11670-b4f0-49f3-ab8b-779b9ac9caab external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 18:03:18 compute-0 nova_compute[187223]: 2025-11-28 18:03:18.127 187227 DEBUG oslo_concurrency.lockutils [req-b3fc7996-3ba2-489e-bbfe-c9947a64b1c8 req-5cd29641-a351-40ba-a83b-9eb7ff33076f 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 18:03:18 compute-0 nova_compute[187223]: 2025-11-28 18:03:18.128 187227 DEBUG oslo_concurrency.lockutils [req-b3fc7996-3ba2-489e-bbfe-c9947a64b1c8 req-5cd29641-a351-40ba-a83b-9eb7ff33076f 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 18:03:18 compute-0 nova_compute[187223]: 2025-11-28 18:03:18.128 187227 DEBUG oslo_concurrency.lockutils [req-b3fc7996-3ba2-489e-bbfe-c9947a64b1c8 req-5cd29641-a351-40ba-a83b-9eb7ff33076f 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 18:03:18 compute-0 nova_compute[187223]: 2025-11-28 18:03:18.128 187227 DEBUG nova.compute.manager [req-b3fc7996-3ba2-489e-bbfe-c9947a64b1c8 req-5cd29641-a351-40ba-a83b-9eb7ff33076f 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] No waiting events found dispatching network-vif-plugged-e2e11670-b4f0-49f3-ab8b-779b9ac9caab pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 18:03:18 compute-0 nova_compute[187223]: 2025-11-28 18:03:18.128 187227 WARNING nova.compute.manager [req-b3fc7996-3ba2-489e-bbfe-c9947a64b1c8 req-5cd29641-a351-40ba-a83b-9eb7ff33076f 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Received unexpected event network-vif-plugged-e2e11670-b4f0-49f3-ab8b-779b9ac9caab for instance with vm_state active and task_state migrating.
Nov 28 18:03:18 compute-0 nova_compute[187223]: 2025-11-28 18:03:18.128 187227 DEBUG nova.compute.manager [req-b3fc7996-3ba2-489e-bbfe-c9947a64b1c8 req-5cd29641-a351-40ba-a83b-9eb7ff33076f 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Received event network-changed-e2e11670-b4f0-49f3-ab8b-779b9ac9caab external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 18:03:18 compute-0 nova_compute[187223]: 2025-11-28 18:03:18.128 187227 DEBUG nova.compute.manager [req-b3fc7996-3ba2-489e-bbfe-c9947a64b1c8 req-5cd29641-a351-40ba-a83b-9eb7ff33076f 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Refreshing instance network info cache due to event network-changed-e2e11670-b4f0-49f3-ab8b-779b9ac9caab. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 28 18:03:18 compute-0 nova_compute[187223]: 2025-11-28 18:03:18.128 187227 DEBUG oslo_concurrency.lockutils [req-b3fc7996-3ba2-489e-bbfe-c9947a64b1c8 req-5cd29641-a351-40ba-a83b-9eb7ff33076f 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "refresh_cache-b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 18:03:18 compute-0 nova_compute[187223]: 2025-11-28 18:03:18.129 187227 DEBUG oslo_concurrency.lockutils [req-b3fc7996-3ba2-489e-bbfe-c9947a64b1c8 req-5cd29641-a351-40ba-a83b-9eb7ff33076f 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquired lock "refresh_cache-b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 18:03:18 compute-0 nova_compute[187223]: 2025-11-28 18:03:18.129 187227 DEBUG nova.network.neutron [req-b3fc7996-3ba2-489e-bbfe-c9947a64b1c8 req-5cd29641-a351-40ba-a83b-9eb7ff33076f 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Refreshing network info cache for port e2e11670-b4f0-49f3-ab8b-779b9ac9caab _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 28 18:03:18 compute-0 nova_compute[187223]: 2025-11-28 18:03:18.154 187227 DEBUG nova.virt.libvirt.migration [None req-f3692ca3-afe9-492d-8027-0d69949d9121 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 28 18:03:18 compute-0 nova_compute[187223]: 2025-11-28 18:03:18.155 187227 DEBUG nova.virt.libvirt.migration [None req-f3692ca3-afe9-492d-8027-0d69949d9121 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 28 18:03:18 compute-0 nova_compute[187223]: 2025-11-28 18:03:18.659 187227 DEBUG nova.virt.libvirt.migration [None req-f3692ca3-afe9-492d-8027-0d69949d9121 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Nov 28 18:03:18 compute-0 nova_compute[187223]: 2025-11-28 18:03:18.660 187227 DEBUG nova.virt.libvirt.migration [None req-f3692ca3-afe9-492d-8027-0d69949d9121 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Nov 28 18:03:18 compute-0 nova_compute[187223]: 2025-11-28 18:03:18.875 187227 DEBUG nova.virt.driver [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] Emitting event <LifecycleEvent: 1764352998.8752954, b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 18:03:18 compute-0 nova_compute[187223]: 2025-11-28 18:03:18.876 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] VM Paused (Lifecycle Event)
Nov 28 18:03:18 compute-0 nova_compute[187223]: 2025-11-28 18:03:18.951 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 18:03:18 compute-0 nova_compute[187223]: 2025-11-28 18:03:18.956 187227 DEBUG nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 28 18:03:19 compute-0 kernel: tape2e11670-b4 (unregistering): left promiscuous mode
Nov 28 18:03:19 compute-0 NetworkManager[55763]: <info>  [1764352999.0592] device (tape2e11670-b4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 28 18:03:19 compute-0 ovn_controller[95574]: 2025-11-28T18:03:19Z|00225|binding|INFO|Releasing lport e2e11670-b4f0-49f3-ab8b-779b9ac9caab from this chassis (sb_readonly=0)
Nov 28 18:03:19 compute-0 ovn_controller[95574]: 2025-11-28T18:03:19Z|00226|binding|INFO|Setting lport e2e11670-b4f0-49f3-ab8b-779b9ac9caab down in Southbound
Nov 28 18:03:19 compute-0 nova_compute[187223]: 2025-11-28 18:03:19.071 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:03:19 compute-0 ovn_controller[95574]: 2025-11-28T18:03:19Z|00227|binding|INFO|Removing iface tape2e11670-b4 ovn-installed in OVS
Nov 28 18:03:19 compute-0 nova_compute[187223]: 2025-11-28 18:03:19.073 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:03:19 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:03:19.085 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:55:a1:8c 10.100.0.8'], port_security=['fa:16:3e:55:a1:8c 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '01f1e5e2-191c-41ea-9a37-abbc72987efb'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-73ece4ba-618d-48d3-8f69-d9fe38606d51', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '880f84e836504513b156c4ba7b7d1dc4', 'neutron:revision_number': '8', 'neutron:security_group_ids': '763f9899-cf0f-4d7a-aeb7-504f822cd750', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2e721e4f-1c56-4181-958c-fefbf24004df, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>], logical_port=e2e11670-b4f0-49f3-ab8b-779b9ac9caab) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fda326f86a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 18:03:19 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:03:19.086 104433 INFO neutron.agent.ovn.metadata.agent [-] Port e2e11670-b4f0-49f3-ab8b-779b9ac9caab in datapath 73ece4ba-618d-48d3-8f69-d9fe38606d51 unbound from our chassis
Nov 28 18:03:19 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:03:19.088 104433 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 73ece4ba-618d-48d3-8f69-d9fe38606d51, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 18:03:19 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:03:19.090 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[707f76e4-f6b6-465b-aa44-3cd22c9d4a21]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 18:03:19 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:03:19.091 104433 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-73ece4ba-618d-48d3-8f69-d9fe38606d51 namespace which is not needed anymore
Nov 28 18:03:19 compute-0 nova_compute[187223]: 2025-11-28 18:03:19.105 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:03:19 compute-0 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d0000001d.scope: Deactivated successfully.
Nov 28 18:03:19 compute-0 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d0000001d.scope: Consumed 13.988s CPU time.
Nov 28 18:03:19 compute-0 systemd-machined[153517]: Machine qemu-21-instance-0000001d terminated.
Nov 28 18:03:19 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:03:19.200 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2ad2dbac-a967-40fb-b69b-7c374c5f8e9d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '32'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 18:03:19 compute-0 neutron-haproxy-ovnmeta-73ece4ba-618d-48d3-8f69-d9fe38606d51[220117]: [NOTICE]   (220121) : haproxy version is 2.8.14-c23fe91
Nov 28 18:03:19 compute-0 neutron-haproxy-ovnmeta-73ece4ba-618d-48d3-8f69-d9fe38606d51[220117]: [NOTICE]   (220121) : path to executable is /usr/sbin/haproxy
Nov 28 18:03:19 compute-0 neutron-haproxy-ovnmeta-73ece4ba-618d-48d3-8f69-d9fe38606d51[220117]: [WARNING]  (220121) : Exiting Master process...
Nov 28 18:03:19 compute-0 neutron-haproxy-ovnmeta-73ece4ba-618d-48d3-8f69-d9fe38606d51[220117]: [ALERT]    (220121) : Current worker (220123) exited with code 143 (Terminated)
Nov 28 18:03:19 compute-0 neutron-haproxy-ovnmeta-73ece4ba-618d-48d3-8f69-d9fe38606d51[220117]: [WARNING]  (220121) : All workers exited. Exiting... (0)
Nov 28 18:03:19 compute-0 systemd[1]: libpod-a5d8cdb058317e4700c53d8cd4d5a9dd121e57c51209e459e8e52479509e11f2.scope: Deactivated successfully.
Nov 28 18:03:19 compute-0 podman[220259]: 2025-11-28 18:03:19.239182902 +0000 UTC m=+0.048004362 container died a5d8cdb058317e4700c53d8cd4d5a9dd121e57c51209e459e8e52479509e11f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-73ece4ba-618d-48d3-8f69-d9fe38606d51, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 18:03:19 compute-0 systemd[1]: var-lib-containers-storage-overlay-4e2ea5f779acc87dc5e9bb3786073668cf9872d7ad520e8469b62d34fdb9ee0e-merged.mount: Deactivated successfully.
Nov 28 18:03:19 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a5d8cdb058317e4700c53d8cd4d5a9dd121e57c51209e459e8e52479509e11f2-userdata-shm.mount: Deactivated successfully.
Nov 28 18:03:19 compute-0 podman[220259]: 2025-11-28 18:03:19.276341878 +0000 UTC m=+0.085163338 container cleanup a5d8cdb058317e4700c53d8cd4d5a9dd121e57c51209e459e8e52479509e11f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-73ece4ba-618d-48d3-8f69-d9fe38606d51, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 28 18:03:19 compute-0 nova_compute[187223]: 2025-11-28 18:03:19.290 187227 INFO nova.compute.manager [None req-f19564b9-4e5b-4d94-a24f-ccae9748396a - - - - - -] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] During sync_power_state the instance has a pending task (migrating). Skip.
Nov 28 18:03:19 compute-0 nova_compute[187223]: 2025-11-28 18:03:19.295 187227 DEBUG nova.virt.libvirt.guest [None req-f3692ca3-afe9-492d-8027-0d69949d9121 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Nov 28 18:03:19 compute-0 nova_compute[187223]: 2025-11-28 18:03:19.295 187227 INFO nova.virt.libvirt.driver [None req-f3692ca3-afe9-492d-8027-0d69949d9121 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Migration operation has completed
Nov 28 18:03:19 compute-0 nova_compute[187223]: 2025-11-28 18:03:19.295 187227 INFO nova.compute.manager [None req-f3692ca3-afe9-492d-8027-0d69949d9121 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] _post_live_migration() is started..
Nov 28 18:03:19 compute-0 systemd[1]: libpod-conmon-a5d8cdb058317e4700c53d8cd4d5a9dd121e57c51209e459e8e52479509e11f2.scope: Deactivated successfully.
Nov 28 18:03:19 compute-0 nova_compute[187223]: 2025-11-28 18:03:19.299 187227 DEBUG nova.virt.libvirt.driver [None req-f3692ca3-afe9-492d-8027-0d69949d9121 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Nov 28 18:03:19 compute-0 nova_compute[187223]: 2025-11-28 18:03:19.300 187227 DEBUG nova.virt.libvirt.driver [None req-f3692ca3-afe9-492d-8027-0d69949d9121 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Nov 28 18:03:19 compute-0 nova_compute[187223]: 2025-11-28 18:03:19.300 187227 DEBUG nova.virt.libvirt.driver [None req-f3692ca3-afe9-492d-8027-0d69949d9121 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Nov 28 18:03:19 compute-0 podman[220300]: 2025-11-28 18:03:19.337892566 +0000 UTC m=+0.038773665 container remove a5d8cdb058317e4700c53d8cd4d5a9dd121e57c51209e459e8e52479509e11f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-73ece4ba-618d-48d3-8f69-d9fe38606d51, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 28 18:03:19 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:03:19.342 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[2c63d7b0-2e50-453f-86b4-fe8d841260dd]: (4, ('Fri Nov 28 06:03:19 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-73ece4ba-618d-48d3-8f69-d9fe38606d51 (a5d8cdb058317e4700c53d8cd4d5a9dd121e57c51209e459e8e52479509e11f2)\na5d8cdb058317e4700c53d8cd4d5a9dd121e57c51209e459e8e52479509e11f2\nFri Nov 28 06:03:19 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-73ece4ba-618d-48d3-8f69-d9fe38606d51 (a5d8cdb058317e4700c53d8cd4d5a9dd121e57c51209e459e8e52479509e11f2)\na5d8cdb058317e4700c53d8cd4d5a9dd121e57c51209e459e8e52479509e11f2\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 18:03:19 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:03:19.344 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[b2ff99b0-b26a-4e8b-bd9f-28a1ae5e8d01]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 18:03:19 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:03:19.345 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap73ece4ba-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 18:03:19 compute-0 nova_compute[187223]: 2025-11-28 18:03:19.347 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:03:19 compute-0 kernel: tap73ece4ba-60: left promiscuous mode
Nov 28 18:03:19 compute-0 nova_compute[187223]: 2025-11-28 18:03:19.362 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:03:19 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:03:19.365 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[26acb103-1a73-437a-8731-8c0337a6b8db]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 18:03:19 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:03:19.385 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[51186a94-0c24-4ae7-981c-0ce2b84c93eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 18:03:19 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:03:19.386 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[49c5d8a1-ba72-45a9-851e-8b48c2daeaae]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 18:03:19 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:03:19.400 208826 DEBUG oslo.privsep.daemon [-] privsep: reply[6f3d4f94-587b-4ea0-adcc-de80310ee791]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 615227, 'reachable_time': 31605, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220318, 'error': None, 'target': 'ovnmeta-73ece4ba-618d-48d3-8f69-d9fe38606d51', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 18:03:19 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:03:19.404 104546 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-73ece4ba-618d-48d3-8f69-d9fe38606d51 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 28 18:03:19 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:03:19.404 104546 DEBUG oslo.privsep.daemon [-] privsep: reply[545cfa5b-a3c7-4178-9772-806dad152531]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 18:03:19 compute-0 systemd[1]: run-netns-ovnmeta\x2d73ece4ba\x2d618d\x2d48d3\x2d8f69\x2dd9fe38606d51.mount: Deactivated successfully.
Nov 28 18:03:19 compute-0 nova_compute[187223]: 2025-11-28 18:03:19.957 187227 DEBUG nova.compute.manager [req-bcfbdbfe-34f1-4ef3-a0d5-3238bdc9a609 req-374a987a-73eb-47ba-a2e7-a7bee727e4ba 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Received event network-vif-unplugged-e2e11670-b4f0-49f3-ab8b-779b9ac9caab external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 18:03:19 compute-0 nova_compute[187223]: 2025-11-28 18:03:19.958 187227 DEBUG oslo_concurrency.lockutils [req-bcfbdbfe-34f1-4ef3-a0d5-3238bdc9a609 req-374a987a-73eb-47ba-a2e7-a7bee727e4ba 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 18:03:19 compute-0 nova_compute[187223]: 2025-11-28 18:03:19.958 187227 DEBUG oslo_concurrency.lockutils [req-bcfbdbfe-34f1-4ef3-a0d5-3238bdc9a609 req-374a987a-73eb-47ba-a2e7-a7bee727e4ba 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 18:03:19 compute-0 nova_compute[187223]: 2025-11-28 18:03:19.959 187227 DEBUG oslo_concurrency.lockutils [req-bcfbdbfe-34f1-4ef3-a0d5-3238bdc9a609 req-374a987a-73eb-47ba-a2e7-a7bee727e4ba 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 18:03:19 compute-0 nova_compute[187223]: 2025-11-28 18:03:19.959 187227 DEBUG nova.compute.manager [req-bcfbdbfe-34f1-4ef3-a0d5-3238bdc9a609 req-374a987a-73eb-47ba-a2e7-a7bee727e4ba 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] No waiting events found dispatching network-vif-unplugged-e2e11670-b4f0-49f3-ab8b-779b9ac9caab pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 18:03:19 compute-0 nova_compute[187223]: 2025-11-28 18:03:19.959 187227 DEBUG nova.compute.manager [req-bcfbdbfe-34f1-4ef3-a0d5-3238bdc9a609 req-374a987a-73eb-47ba-a2e7-a7bee727e4ba 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Received event network-vif-unplugged-e2e11670-b4f0-49f3-ab8b-779b9ac9caab for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 28 18:03:20 compute-0 nova_compute[187223]: 2025-11-28 18:03:20.033 187227 DEBUG nova.network.neutron [None req-f3692ca3-afe9-492d-8027-0d69949d9121 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Activated binding for port e2e11670-b4f0-49f3-ab8b-779b9ac9caab and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Nov 28 18:03:20 compute-0 nova_compute[187223]: 2025-11-28 18:03:20.034 187227 DEBUG nova.compute.manager [None req-f3692ca3-afe9-492d-8027-0d69949d9121 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "e2e11670-b4f0-49f3-ab8b-779b9ac9caab", "address": "fa:16:3e:55:a1:8c", "network": {"id": "73ece4ba-618d-48d3-8f69-d9fe38606d51", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1107034145-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "880f84e836504513b156c4ba7b7d1dc4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape2e11670-b4", "ovs_interfaceid": "e2e11670-b4f0-49f3-ab8b-779b9ac9caab", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Nov 28 18:03:20 compute-0 nova_compute[187223]: 2025-11-28 18:03:20.035 187227 DEBUG nova.virt.libvirt.vif [None req-f3692ca3-afe9-492d-8027-0d69949d9121 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T18:02:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-1100562807',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-1100562807',id=29,image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-28T18:02:58Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='880f84e836504513b156c4ba7b7d1dc4',ramdisk_id='',reservation_id='r-eydq3mzo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e66bcfff-a835-4b6a-9892-490d158c356a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-2088588059',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-2088588059-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-28T18:03:08Z,user_data=None,user_id='17c1adf6f47747cb879184a3da9c1d22',uuid=b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e2e11670-b4f0-49f3-ab8b-779b9ac9caab", "address": "fa:16:3e:55:a1:8c", "network": {"id": "73ece4ba-618d-48d3-8f69-d9fe38606d51", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1107034145-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "880f84e836504513b156c4ba7b7d1dc4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape2e11670-b4", "ovs_interfaceid": "e2e11670-b4f0-49f3-ab8b-779b9ac9caab", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 28 18:03:20 compute-0 nova_compute[187223]: 2025-11-28 18:03:20.035 187227 DEBUG nova.network.os_vif_util [None req-f3692ca3-afe9-492d-8027-0d69949d9121 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Converting VIF {"id": "e2e11670-b4f0-49f3-ab8b-779b9ac9caab", "address": "fa:16:3e:55:a1:8c", "network": {"id": "73ece4ba-618d-48d3-8f69-d9fe38606d51", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1107034145-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "880f84e836504513b156c4ba7b7d1dc4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape2e11670-b4", "ovs_interfaceid": "e2e11670-b4f0-49f3-ab8b-779b9ac9caab", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 18:03:20 compute-0 nova_compute[187223]: 2025-11-28 18:03:20.037 187227 DEBUG nova.network.os_vif_util [None req-f3692ca3-afe9-492d-8027-0d69949d9121 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:55:a1:8c,bridge_name='br-int',has_traffic_filtering=True,id=e2e11670-b4f0-49f3-ab8b-779b9ac9caab,network=Network(73ece4ba-618d-48d3-8f69-d9fe38606d51),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape2e11670-b4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 18:03:20 compute-0 nova_compute[187223]: 2025-11-28 18:03:20.038 187227 DEBUG os_vif [None req-f3692ca3-afe9-492d-8027-0d69949d9121 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:55:a1:8c,bridge_name='br-int',has_traffic_filtering=True,id=e2e11670-b4f0-49f3-ab8b-779b9ac9caab,network=Network(73ece4ba-618d-48d3-8f69-d9fe38606d51),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape2e11670-b4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 28 18:03:20 compute-0 nova_compute[187223]: 2025-11-28 18:03:20.041 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:03:20 compute-0 nova_compute[187223]: 2025-11-28 18:03:20.041 187227 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape2e11670-b4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 18:03:20 compute-0 nova_compute[187223]: 2025-11-28 18:03:20.044 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:03:20 compute-0 nova_compute[187223]: 2025-11-28 18:03:20.047 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 18:03:20 compute-0 nova_compute[187223]: 2025-11-28 18:03:20.049 187227 INFO os_vif [None req-f3692ca3-afe9-492d-8027-0d69949d9121 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:55:a1:8c,bridge_name='br-int',has_traffic_filtering=True,id=e2e11670-b4f0-49f3-ab8b-779b9ac9caab,network=Network(73ece4ba-618d-48d3-8f69-d9fe38606d51),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape2e11670-b4')
Nov 28 18:03:20 compute-0 nova_compute[187223]: 2025-11-28 18:03:20.050 187227 DEBUG oslo_concurrency.lockutils [None req-f3692ca3-afe9-492d-8027-0d69949d9121 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 18:03:20 compute-0 nova_compute[187223]: 2025-11-28 18:03:20.050 187227 DEBUG oslo_concurrency.lockutils [None req-f3692ca3-afe9-492d-8027-0d69949d9121 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 18:03:20 compute-0 nova_compute[187223]: 2025-11-28 18:03:20.050 187227 DEBUG oslo_concurrency.lockutils [None req-f3692ca3-afe9-492d-8027-0d69949d9121 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 18:03:20 compute-0 nova_compute[187223]: 2025-11-28 18:03:20.050 187227 DEBUG nova.compute.manager [None req-f3692ca3-afe9-492d-8027-0d69949d9121 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Nov 28 18:03:20 compute-0 nova_compute[187223]: 2025-11-28 18:03:20.051 187227 INFO nova.virt.libvirt.driver [None req-f3692ca3-afe9-492d-8027-0d69949d9121 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Deleting instance files /var/lib/nova/instances/b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1_del
Nov 28 18:03:20 compute-0 nova_compute[187223]: 2025-11-28 18:03:20.051 187227 INFO nova.virt.libvirt.driver [None req-f3692ca3-afe9-492d-8027-0d69949d9121 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Deletion of /var/lib/nova/instances/b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1_del complete
Nov 28 18:03:20 compute-0 nova_compute[187223]: 2025-11-28 18:03:20.231 187227 DEBUG nova.compute.manager [req-05bd9c66-d201-434d-86a3-18f5b6f5d967 req-ace417fb-8cfc-490e-8935-c8ccc6547755 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Received event network-vif-unplugged-e2e11670-b4f0-49f3-ab8b-779b9ac9caab external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 18:03:20 compute-0 nova_compute[187223]: 2025-11-28 18:03:20.231 187227 DEBUG oslo_concurrency.lockutils [req-05bd9c66-d201-434d-86a3-18f5b6f5d967 req-ace417fb-8cfc-490e-8935-c8ccc6547755 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 18:03:20 compute-0 nova_compute[187223]: 2025-11-28 18:03:20.233 187227 DEBUG oslo_concurrency.lockutils [req-05bd9c66-d201-434d-86a3-18f5b6f5d967 req-ace417fb-8cfc-490e-8935-c8ccc6547755 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 18:03:20 compute-0 nova_compute[187223]: 2025-11-28 18:03:20.233 187227 DEBUG oslo_concurrency.lockutils [req-05bd9c66-d201-434d-86a3-18f5b6f5d967 req-ace417fb-8cfc-490e-8935-c8ccc6547755 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 18:03:20 compute-0 nova_compute[187223]: 2025-11-28 18:03:20.234 187227 DEBUG nova.compute.manager [req-05bd9c66-d201-434d-86a3-18f5b6f5d967 req-ace417fb-8cfc-490e-8935-c8ccc6547755 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] No waiting events found dispatching network-vif-unplugged-e2e11670-b4f0-49f3-ab8b-779b9ac9caab pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 18:03:20 compute-0 nova_compute[187223]: 2025-11-28 18:03:20.234 187227 DEBUG nova.compute.manager [req-05bd9c66-d201-434d-86a3-18f5b6f5d967 req-ace417fb-8cfc-490e-8935-c8ccc6547755 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Received event network-vif-unplugged-e2e11670-b4f0-49f3-ab8b-779b9ac9caab for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 28 18:03:20 compute-0 nova_compute[187223]: 2025-11-28 18:03:20.235 187227 DEBUG nova.compute.manager [req-05bd9c66-d201-434d-86a3-18f5b6f5d967 req-ace417fb-8cfc-490e-8935-c8ccc6547755 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Received event network-vif-plugged-e2e11670-b4f0-49f3-ab8b-779b9ac9caab external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 18:03:20 compute-0 nova_compute[187223]: 2025-11-28 18:03:20.235 187227 DEBUG oslo_concurrency.lockutils [req-05bd9c66-d201-434d-86a3-18f5b6f5d967 req-ace417fb-8cfc-490e-8935-c8ccc6547755 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 18:03:20 compute-0 nova_compute[187223]: 2025-11-28 18:03:20.236 187227 DEBUG oslo_concurrency.lockutils [req-05bd9c66-d201-434d-86a3-18f5b6f5d967 req-ace417fb-8cfc-490e-8935-c8ccc6547755 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 18:03:20 compute-0 nova_compute[187223]: 2025-11-28 18:03:20.236 187227 DEBUG oslo_concurrency.lockutils [req-05bd9c66-d201-434d-86a3-18f5b6f5d967 req-ace417fb-8cfc-490e-8935-c8ccc6547755 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 18:03:20 compute-0 nova_compute[187223]: 2025-11-28 18:03:20.236 187227 DEBUG nova.compute.manager [req-05bd9c66-d201-434d-86a3-18f5b6f5d967 req-ace417fb-8cfc-490e-8935-c8ccc6547755 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] No waiting events found dispatching network-vif-plugged-e2e11670-b4f0-49f3-ab8b-779b9ac9caab pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 18:03:20 compute-0 nova_compute[187223]: 2025-11-28 18:03:20.237 187227 WARNING nova.compute.manager [req-05bd9c66-d201-434d-86a3-18f5b6f5d967 req-ace417fb-8cfc-490e-8935-c8ccc6547755 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Received unexpected event network-vif-plugged-e2e11670-b4f0-49f3-ab8b-779b9ac9caab for instance with vm_state active and task_state migrating.
Nov 28 18:03:20 compute-0 nova_compute[187223]: 2025-11-28 18:03:20.237 187227 DEBUG nova.compute.manager [req-05bd9c66-d201-434d-86a3-18f5b6f5d967 req-ace417fb-8cfc-490e-8935-c8ccc6547755 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Received event network-vif-plugged-e2e11670-b4f0-49f3-ab8b-779b9ac9caab external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 18:03:20 compute-0 nova_compute[187223]: 2025-11-28 18:03:20.238 187227 DEBUG oslo_concurrency.lockutils [req-05bd9c66-d201-434d-86a3-18f5b6f5d967 req-ace417fb-8cfc-490e-8935-c8ccc6547755 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 18:03:20 compute-0 nova_compute[187223]: 2025-11-28 18:03:20.238 187227 DEBUG oslo_concurrency.lockutils [req-05bd9c66-d201-434d-86a3-18f5b6f5d967 req-ace417fb-8cfc-490e-8935-c8ccc6547755 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 18:03:20 compute-0 nova_compute[187223]: 2025-11-28 18:03:20.238 187227 DEBUG oslo_concurrency.lockutils [req-05bd9c66-d201-434d-86a3-18f5b6f5d967 req-ace417fb-8cfc-490e-8935-c8ccc6547755 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 18:03:20 compute-0 nova_compute[187223]: 2025-11-28 18:03:20.239 187227 DEBUG nova.compute.manager [req-05bd9c66-d201-434d-86a3-18f5b6f5d967 req-ace417fb-8cfc-490e-8935-c8ccc6547755 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] No waiting events found dispatching network-vif-plugged-e2e11670-b4f0-49f3-ab8b-779b9ac9caab pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 18:03:20 compute-0 nova_compute[187223]: 2025-11-28 18:03:20.239 187227 WARNING nova.compute.manager [req-05bd9c66-d201-434d-86a3-18f5b6f5d967 req-ace417fb-8cfc-490e-8935-c8ccc6547755 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Received unexpected event network-vif-plugged-e2e11670-b4f0-49f3-ab8b-779b9ac9caab for instance with vm_state active and task_state migrating.
Nov 28 18:03:20 compute-0 nova_compute[187223]: 2025-11-28 18:03:20.259 187227 DEBUG nova.network.neutron [req-b3fc7996-3ba2-489e-bbfe-c9947a64b1c8 req-5cd29641-a351-40ba-a83b-9eb7ff33076f 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Updated VIF entry in instance network info cache for port e2e11670-b4f0-49f3-ab8b-779b9ac9caab. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 28 18:03:20 compute-0 nova_compute[187223]: 2025-11-28 18:03:20.259 187227 DEBUG nova.network.neutron [req-b3fc7996-3ba2-489e-bbfe-c9947a64b1c8 req-5cd29641-a351-40ba-a83b-9eb7ff33076f 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Updating instance_info_cache with network_info: [{"id": "e2e11670-b4f0-49f3-ab8b-779b9ac9caab", "address": "fa:16:3e:55:a1:8c", "network": {"id": "73ece4ba-618d-48d3-8f69-d9fe38606d51", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1107034145-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "880f84e836504513b156c4ba7b7d1dc4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape2e11670-b4", "ovs_interfaceid": "e2e11670-b4f0-49f3-ab8b-779b9ac9caab", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 18:03:20 compute-0 nova_compute[187223]: 2025-11-28 18:03:20.275 187227 DEBUG oslo_concurrency.lockutils [req-b3fc7996-3ba2-489e-bbfe-c9947a64b1c8 req-5cd29641-a351-40ba-a83b-9eb7ff33076f 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Releasing lock "refresh_cache-b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 18:03:20 compute-0 nova_compute[187223]: 2025-11-28 18:03:20.679 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:03:22 compute-0 podman[220319]: 2025-11-28 18:03:22.200002974 +0000 UTC m=+0.061141897 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd)
Nov 28 18:03:22 compute-0 podman[220320]: 2025-11-28 18:03:22.22584957 +0000 UTC m=+0.085690934 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, io.buildah.version=1.41.3)
Nov 28 18:03:22 compute-0 nova_compute[187223]: 2025-11-28 18:03:22.301 187227 DEBUG nova.compute.manager [req-96aa8c56-e625-47e5-99b0-0b1e8dbce48f req-415a81aa-aff5-4492-acce-db8bb26eb153 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Received event network-vif-plugged-e2e11670-b4f0-49f3-ab8b-779b9ac9caab external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 18:03:22 compute-0 nova_compute[187223]: 2025-11-28 18:03:22.301 187227 DEBUG oslo_concurrency.lockutils [req-96aa8c56-e625-47e5-99b0-0b1e8dbce48f req-415a81aa-aff5-4492-acce-db8bb26eb153 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 18:03:22 compute-0 nova_compute[187223]: 2025-11-28 18:03:22.301 187227 DEBUG oslo_concurrency.lockutils [req-96aa8c56-e625-47e5-99b0-0b1e8dbce48f req-415a81aa-aff5-4492-acce-db8bb26eb153 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 18:03:22 compute-0 nova_compute[187223]: 2025-11-28 18:03:22.301 187227 DEBUG oslo_concurrency.lockutils [req-96aa8c56-e625-47e5-99b0-0b1e8dbce48f req-415a81aa-aff5-4492-acce-db8bb26eb153 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 18:03:22 compute-0 nova_compute[187223]: 2025-11-28 18:03:22.302 187227 DEBUG nova.compute.manager [req-96aa8c56-e625-47e5-99b0-0b1e8dbce48f req-415a81aa-aff5-4492-acce-db8bb26eb153 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] No waiting events found dispatching network-vif-plugged-e2e11670-b4f0-49f3-ab8b-779b9ac9caab pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 18:03:22 compute-0 nova_compute[187223]: 2025-11-28 18:03:22.302 187227 WARNING nova.compute.manager [req-96aa8c56-e625-47e5-99b0-0b1e8dbce48f req-415a81aa-aff5-4492-acce-db8bb26eb153 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Received unexpected event network-vif-plugged-e2e11670-b4f0-49f3-ab8b-779b9ac9caab for instance with vm_state active and task_state migrating.
Nov 28 18:03:22 compute-0 nova_compute[187223]: 2025-11-28 18:03:22.302 187227 DEBUG nova.compute.manager [req-96aa8c56-e625-47e5-99b0-0b1e8dbce48f req-415a81aa-aff5-4492-acce-db8bb26eb153 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Received event network-vif-plugged-e2e11670-b4f0-49f3-ab8b-779b9ac9caab external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 18:03:22 compute-0 nova_compute[187223]: 2025-11-28 18:03:22.302 187227 DEBUG oslo_concurrency.lockutils [req-96aa8c56-e625-47e5-99b0-0b1e8dbce48f req-415a81aa-aff5-4492-acce-db8bb26eb153 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 18:03:22 compute-0 nova_compute[187223]: 2025-11-28 18:03:22.302 187227 DEBUG oslo_concurrency.lockutils [req-96aa8c56-e625-47e5-99b0-0b1e8dbce48f req-415a81aa-aff5-4492-acce-db8bb26eb153 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 18:03:22 compute-0 nova_compute[187223]: 2025-11-28 18:03:22.302 187227 DEBUG oslo_concurrency.lockutils [req-96aa8c56-e625-47e5-99b0-0b1e8dbce48f req-415a81aa-aff5-4492-acce-db8bb26eb153 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 18:03:22 compute-0 nova_compute[187223]: 2025-11-28 18:03:22.302 187227 DEBUG nova.compute.manager [req-96aa8c56-e625-47e5-99b0-0b1e8dbce48f req-415a81aa-aff5-4492-acce-db8bb26eb153 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] No waiting events found dispatching network-vif-plugged-e2e11670-b4f0-49f3-ab8b-779b9ac9caab pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 18:03:22 compute-0 nova_compute[187223]: 2025-11-28 18:03:22.303 187227 WARNING nova.compute.manager [req-96aa8c56-e625-47e5-99b0-0b1e8dbce48f req-415a81aa-aff5-4492-acce-db8bb26eb153 3c771b22d58a4b1aaad2adb2a183512c 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Received unexpected event network-vif-plugged-e2e11670-b4f0-49f3-ab8b-779b9ac9caab for instance with vm_state active and task_state migrating.
Nov 28 18:03:24 compute-0 podman[220365]: 2025-11-28 18:03:24.186958175 +0000 UTC m=+0.052837918 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, name=ubi9-minimal, version=9.6, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, managed_by=edpm_ansible, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, 
build-date=2025-08-20T13:12:41, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, maintainer=Red Hat, Inc.)
Nov 28 18:03:24 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Nov 28 18:03:24 compute-0 systemd[220214]: Activating special unit Exit the Session...
Nov 28 18:03:24 compute-0 systemd[220214]: Stopped target Main User Target.
Nov 28 18:03:24 compute-0 systemd[220214]: Stopped target Basic System.
Nov 28 18:03:24 compute-0 systemd[220214]: Stopped target Paths.
Nov 28 18:03:24 compute-0 systemd[220214]: Stopped target Sockets.
Nov 28 18:03:24 compute-0 systemd[220214]: Stopped target Timers.
Nov 28 18:03:24 compute-0 systemd[220214]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 28 18:03:24 compute-0 systemd[220214]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 28 18:03:24 compute-0 systemd[220214]: Closed D-Bus User Message Bus Socket.
Nov 28 18:03:24 compute-0 systemd[220214]: Stopped Create User's Volatile Files and Directories.
Nov 28 18:03:24 compute-0 systemd[220214]: Removed slice User Application Slice.
Nov 28 18:03:24 compute-0 systemd[220214]: Reached target Shutdown.
Nov 28 18:03:24 compute-0 systemd[220214]: Finished Exit the Session.
Nov 28 18:03:24 compute-0 systemd[220214]: Reached target Exit the Session.
Nov 28 18:03:24 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Nov 28 18:03:24 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Nov 28 18:03:24 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 28 18:03:24 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 28 18:03:24 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 28 18:03:24 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 28 18:03:24 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Nov 28 18:03:25 compute-0 nova_compute[187223]: 2025-11-28 18:03:25.045 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:03:25 compute-0 nova_compute[187223]: 2025-11-28 18:03:25.681 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:03:26 compute-0 nova_compute[187223]: 2025-11-28 18:03:26.194 187227 DEBUG oslo_concurrency.lockutils [None req-f3692ca3-afe9-492d-8027-0d69949d9121 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 18:03:26 compute-0 nova_compute[187223]: 2025-11-28 18:03:26.195 187227 DEBUG oslo_concurrency.lockutils [None req-f3692ca3-afe9-492d-8027-0d69949d9121 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 18:03:26 compute-0 nova_compute[187223]: 2025-11-28 18:03:26.195 187227 DEBUG oslo_concurrency.lockutils [None req-f3692ca3-afe9-492d-8027-0d69949d9121 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 18:03:26 compute-0 nova_compute[187223]: 2025-11-28 18:03:26.216 187227 DEBUG oslo_concurrency.lockutils [None req-f3692ca3-afe9-492d-8027-0d69949d9121 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 18:03:26 compute-0 nova_compute[187223]: 2025-11-28 18:03:26.217 187227 DEBUG oslo_concurrency.lockutils [None req-f3692ca3-afe9-492d-8027-0d69949d9121 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 18:03:26 compute-0 nova_compute[187223]: 2025-11-28 18:03:26.217 187227 DEBUG oslo_concurrency.lockutils [None req-f3692ca3-afe9-492d-8027-0d69949d9121 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 18:03:26 compute-0 nova_compute[187223]: 2025-11-28 18:03:26.218 187227 DEBUG nova.compute.resource_tracker [None req-f3692ca3-afe9-492d-8027-0d69949d9121 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 18:03:26 compute-0 nova_compute[187223]: 2025-11-28 18:03:26.377 187227 WARNING nova.virt.libvirt.driver [None req-f3692ca3-afe9-492d-8027-0d69949d9121 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 18:03:26 compute-0 nova_compute[187223]: 2025-11-28 18:03:26.378 187227 DEBUG nova.compute.resource_tracker [None req-f3692ca3-afe9-492d-8027-0d69949d9121 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5856MB free_disk=73.34106826782227GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": 
"0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 18:03:26 compute-0 nova_compute[187223]: 2025-11-28 18:03:26.378 187227 DEBUG oslo_concurrency.lockutils [None req-f3692ca3-afe9-492d-8027-0d69949d9121 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 18:03:26 compute-0 nova_compute[187223]: 2025-11-28 18:03:26.378 187227 DEBUG oslo_concurrency.lockutils [None req-f3692ca3-afe9-492d-8027-0d69949d9121 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 18:03:26 compute-0 nova_compute[187223]: 2025-11-28 18:03:26.413 187227 DEBUG nova.compute.resource_tracker [None req-f3692ca3-afe9-492d-8027-0d69949d9121 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Migration for instance b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Nov 28 18:03:26 compute-0 nova_compute[187223]: 2025-11-28 18:03:26.430 187227 DEBUG nova.compute.resource_tracker [None req-f3692ca3-afe9-492d-8027-0d69949d9121 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Nov 28 18:03:26 compute-0 nova_compute[187223]: 2025-11-28 18:03:26.466 187227 DEBUG nova.compute.resource_tracker [None req-f3692ca3-afe9-492d-8027-0d69949d9121 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Migration e5d78c21-f6d8-469e-98d5-5f9ecb56a48e is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Nov 28 18:03:26 compute-0 nova_compute[187223]: 2025-11-28 18:03:26.467 187227 DEBUG nova.compute.resource_tracker [None req-f3692ca3-afe9-492d-8027-0d69949d9121 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 18:03:26 compute-0 nova_compute[187223]: 2025-11-28 18:03:26.467 187227 DEBUG nova.compute.resource_tracker [None req-f3692ca3-afe9-492d-8027-0d69949d9121 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 18:03:26 compute-0 nova_compute[187223]: 2025-11-28 18:03:26.511 187227 DEBUG nova.compute.provider_tree [None req-f3692ca3-afe9-492d-8027-0d69949d9121 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 18:03:26 compute-0 nova_compute[187223]: 2025-11-28 18:03:26.528 187227 DEBUG nova.scheduler.client.report [None req-f3692ca3-afe9-492d-8027-0d69949d9121 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 18:03:26 compute-0 nova_compute[187223]: 2025-11-28 18:03:26.569 187227 DEBUG nova.compute.resource_tracker [None req-f3692ca3-afe9-492d-8027-0d69949d9121 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 18:03:26 compute-0 nova_compute[187223]: 2025-11-28 18:03:26.570 187227 DEBUG oslo_concurrency.lockutils [None req-f3692ca3-afe9-492d-8027-0d69949d9121 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.192s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 18:03:26 compute-0 nova_compute[187223]: 2025-11-28 18:03:26.577 187227 INFO nova.compute.manager [None req-f3692ca3-afe9-492d-8027-0d69949d9121 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Nov 28 18:03:26 compute-0 nova_compute[187223]: 2025-11-28 18:03:26.702 187227 INFO nova.scheduler.client.report [None req-f3692ca3-afe9-492d-8027-0d69949d9121 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] Deleted allocation for migration e5d78c21-f6d8-469e-98d5-5f9ecb56a48e
Nov 28 18:03:26 compute-0 nova_compute[187223]: 2025-11-28 18:03:26.702 187227 DEBUG nova.virt.libvirt.driver [None req-f3692ca3-afe9-492d-8027-0d69949d9121 a51bf481a34f4c319e0679ad0d6cc4af 2794e0df1eeb42989d3ec1359010ad95 - - default default] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Nov 28 18:03:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:03:27.717 104433 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 18:03:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:03:27.718 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 18:03:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:03:27.718 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 18:03:29 compute-0 podman[197556]: time="2025-11-28T18:03:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 18:03:29 compute-0 podman[197556]: @ - - [28/Nov/2025:18:03:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 28 18:03:29 compute-0 podman[197556]: @ - - [28/Nov/2025:18:03:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2601 "" "Go-http-client/1.1"
Nov 28 18:03:30 compute-0 nova_compute[187223]: 2025-11-28 18:03:30.047 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:03:30 compute-0 nova_compute[187223]: 2025-11-28 18:03:30.683 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:03:31 compute-0 openstack_network_exporter[199717]: ERROR   18:03:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 18:03:31 compute-0 openstack_network_exporter[199717]: ERROR   18:03:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 18:03:31 compute-0 openstack_network_exporter[199717]: ERROR   18:03:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 18:03:31 compute-0 openstack_network_exporter[199717]: ERROR   18:03:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 18:03:31 compute-0 openstack_network_exporter[199717]: 
Nov 28 18:03:31 compute-0 openstack_network_exporter[199717]: ERROR   18:03:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 18:03:31 compute-0 openstack_network_exporter[199717]: 
Nov 28 18:03:31 compute-0 sshd-session[220388]: Invalid user ubuntu from 193.32.162.145 port 45452
Nov 28 18:03:31 compute-0 sshd-session[220388]: Connection closed by invalid user ubuntu 193.32.162.145 port 45452 [preauth]
Nov 28 18:03:34 compute-0 nova_compute[187223]: 2025-11-28 18:03:34.295 187227 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764352999.2941673, b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 18:03:34 compute-0 nova_compute[187223]: 2025-11-28 18:03:34.296 187227 INFO nova.compute.manager [-] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] VM Stopped (Lifecycle Event)
Nov 28 18:03:34 compute-0 nova_compute[187223]: 2025-11-28 18:03:34.327 187227 DEBUG nova.compute.manager [None req-32541992-93b2-4e2c-8412-c2bdc20d841a - - - - - -] [instance: b4e3e0a1-f0f8-4368-a18a-7efb12bbb8f1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 18:03:35 compute-0 nova_compute[187223]: 2025-11-28 18:03:35.051 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:03:35 compute-0 nova_compute[187223]: 2025-11-28 18:03:35.685 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:03:36 compute-0 podman[220390]: 2025-11-28 18:03:36.225501273 +0000 UTC m=+0.073377234 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 18:03:40 compute-0 nova_compute[187223]: 2025-11-28 18:03:40.054 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:03:40 compute-0 nova_compute[187223]: 2025-11-28 18:03:40.685 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:03:42 compute-0 nova_compute[187223]: 2025-11-28 18:03:42.661 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 18:03:45 compute-0 nova_compute[187223]: 2025-11-28 18:03:45.056 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:03:45 compute-0 podman[220416]: 2025-11-28 18:03:45.204624971 +0000 UTC m=+0.065168609 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 28 18:03:45 compute-0 nova_compute[187223]: 2025-11-28 18:03:45.687 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:03:50 compute-0 nova_compute[187223]: 2025-11-28 18:03:50.059 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:03:50 compute-0 nova_compute[187223]: 2025-11-28 18:03:50.689 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:03:52 compute-0 nova_compute[187223]: 2025-11-28 18:03:52.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 18:03:52 compute-0 nova_compute[187223]: 2025-11-28 18:03:52.685 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 18:03:52 compute-0 nova_compute[187223]: 2025-11-28 18:03:52.685 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 18:03:52 compute-0 nova_compute[187223]: 2025-11-28 18:03:52.685 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 18:03:53 compute-0 podman[220439]: 2025-11-28 18:03:53.239439834 +0000 UTC m=+0.090879189 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 18:03:53 compute-0 podman[220438]: 2025-11-28 18:03:53.262986871 +0000 UTC m=+0.112180889 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Nov 28 18:03:54 compute-0 nova_compute[187223]: 2025-11-28 18:03:54.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 18:03:55 compute-0 nova_compute[187223]: 2025-11-28 18:03:55.061 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:03:55 compute-0 podman[220484]: 2025-11-28 18:03:55.220719444 +0000 UTC m=+0.074041375 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.openshift.tags=minimal rhel9, version=9.6, io.openshift.expose-services=)
Nov 28 18:03:55 compute-0 nova_compute[187223]: 2025-11-28 18:03:55.728 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:03:59 compute-0 podman[197556]: time="2025-11-28T18:03:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 18:03:59 compute-0 podman[197556]: @ - - [28/Nov/2025:18:03:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 28 18:03:59 compute-0 podman[197556]: @ - - [28/Nov/2025:18:03:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2598 "" "Go-http-client/1.1"
Nov 28 18:04:00 compute-0 nova_compute[187223]: 2025-11-28 18:04:00.063 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:04:00 compute-0 ovn_controller[95574]: 2025-11-28T18:04:00Z|00228|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Nov 28 18:04:00 compute-0 nova_compute[187223]: 2025-11-28 18:04:00.729 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:04:01 compute-0 openstack_network_exporter[199717]: ERROR   18:04:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 18:04:01 compute-0 openstack_network_exporter[199717]: ERROR   18:04:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 18:04:01 compute-0 openstack_network_exporter[199717]: ERROR   18:04:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 18:04:01 compute-0 openstack_network_exporter[199717]: ERROR   18:04:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 18:04:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 18:04:01 compute-0 openstack_network_exporter[199717]: ERROR   18:04:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 18:04:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 18:04:01 compute-0 nova_compute[187223]: 2025-11-28 18:04:01.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 18:04:01 compute-0 nova_compute[187223]: 2025-11-28 18:04:01.716 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 18:04:01 compute-0 nova_compute[187223]: 2025-11-28 18:04:01.716 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 18:04:01 compute-0 nova_compute[187223]: 2025-11-28 18:04:01.717 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 18:04:01 compute-0 nova_compute[187223]: 2025-11-28 18:04:01.717 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 18:04:01 compute-0 nova_compute[187223]: 2025-11-28 18:04:01.880 187227 WARNING nova.virt.libvirt.driver [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 18:04:01 compute-0 nova_compute[187223]: 2025-11-28 18:04:01.881 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5868MB free_disk=73.34106826782227GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 18:04:01 compute-0 nova_compute[187223]: 2025-11-28 18:04:01.881 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 18:04:01 compute-0 nova_compute[187223]: 2025-11-28 18:04:01.882 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 18:04:02 compute-0 nova_compute[187223]: 2025-11-28 18:04:02.181 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 18:04:02 compute-0 nova_compute[187223]: 2025-11-28 18:04:02.182 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 18:04:02 compute-0 nova_compute[187223]: 2025-11-28 18:04:02.217 187227 DEBUG nova.compute.provider_tree [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 18:04:02 compute-0 nova_compute[187223]: 2025-11-28 18:04:02.245 187227 DEBUG nova.scheduler.client.report [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 18:04:02 compute-0 nova_compute[187223]: 2025-11-28 18:04:02.247 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 18:04:02 compute-0 nova_compute[187223]: 2025-11-28 18:04:02.247 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.365s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 18:04:03 compute-0 nova_compute[187223]: 2025-11-28 18:04:03.247 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 18:04:03 compute-0 nova_compute[187223]: 2025-11-28 18:04:03.248 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 18:04:03 compute-0 nova_compute[187223]: 2025-11-28 18:04:03.248 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 18:04:03 compute-0 nova_compute[187223]: 2025-11-28 18:04:03.280 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 18:04:03 compute-0 nova_compute[187223]: 2025-11-28 18:04:03.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 18:04:03 compute-0 nova_compute[187223]: 2025-11-28 18:04:03.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 18:04:03 compute-0 nova_compute[187223]: 2025-11-28 18:04:03.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 18:04:05 compute-0 nova_compute[187223]: 2025-11-28 18:04:05.066 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:04:05 compute-0 nova_compute[187223]: 2025-11-28 18:04:05.730 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:04:07 compute-0 podman[220505]: 2025-11-28 18:04:07.186740066 +0000 UTC m=+0.050546999 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 18:04:09 compute-0 nova_compute[187223]: 2025-11-28 18:04:09.753 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:04:10 compute-0 nova_compute[187223]: 2025-11-28 18:04:10.068 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:04:10 compute-0 nova_compute[187223]: 2025-11-28 18:04:10.733 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:04:15 compute-0 nova_compute[187223]: 2025-11-28 18:04:15.070 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:04:15 compute-0 nova_compute[187223]: 2025-11-28 18:04:15.735 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:04:16 compute-0 podman[220529]: 2025-11-28 18:04:16.221168152 +0000 UTC m=+0.083733105 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125)
Nov 28 18:04:17 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:04:17.268 104433 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=33, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3e:3c:af', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '0e:e0:60:f9:5f:25'}, ipsec=False) old=SB_Global(nb_cfg=32) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 18:04:17 compute-0 nova_compute[187223]: 2025-11-28 18:04:17.268 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:04:17 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:04:17.270 104433 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 18:04:18 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:04:18.273 104433 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2ad2dbac-a967-40fb-b69b-7c374c5f8e9d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '33'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 18:04:20 compute-0 nova_compute[187223]: 2025-11-28 18:04:20.072 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:04:20 compute-0 nova_compute[187223]: 2025-11-28 18:04:20.737 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:04:24 compute-0 podman[220548]: 2025-11-28 18:04:24.221661644 +0000 UTC m=+0.074139928 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 28 18:04:24 compute-0 podman[220549]: 2025-11-28 18:04:24.251895212 +0000 UTC m=+0.110248022 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 28 18:04:25 compute-0 nova_compute[187223]: 2025-11-28 18:04:25.076 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:04:25 compute-0 nova_compute[187223]: 2025-11-28 18:04:25.738 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:04:26 compute-0 podman[220592]: 2025-11-28 18:04:26.229752009 +0000 UTC m=+0.079679602 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, architecture=x86_64, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal)
Nov 28 18:04:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:04:27.719 104433 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 18:04:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:04:27.719 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 18:04:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:04:27.719 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 18:04:29 compute-0 podman[197556]: time="2025-11-28T18:04:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 18:04:29 compute-0 podman[197556]: @ - - [28/Nov/2025:18:04:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 28 18:04:29 compute-0 podman[197556]: @ - - [28/Nov/2025:18:04:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2601 "" "Go-http-client/1.1"
Nov 28 18:04:30 compute-0 nova_compute[187223]: 2025-11-28 18:04:30.079 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:04:30 compute-0 nova_compute[187223]: 2025-11-28 18:04:30.740 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:04:31 compute-0 openstack_network_exporter[199717]: ERROR   18:04:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 18:04:31 compute-0 openstack_network_exporter[199717]: ERROR   18:04:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 18:04:31 compute-0 openstack_network_exporter[199717]: ERROR   18:04:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 18:04:31 compute-0 openstack_network_exporter[199717]: ERROR   18:04:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 18:04:31 compute-0 openstack_network_exporter[199717]: 
Nov 28 18:04:31 compute-0 openstack_network_exporter[199717]: ERROR   18:04:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 18:04:31 compute-0 openstack_network_exporter[199717]: 
Nov 28 18:04:35 compute-0 nova_compute[187223]: 2025-11-28 18:04:35.081 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:04:35 compute-0 nova_compute[187223]: 2025-11-28 18:04:35.742 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:04:38 compute-0 podman[220613]: 2025-11-28 18:04:38.184567596 +0000 UTC m=+0.052609331 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 18:04:40 compute-0 nova_compute[187223]: 2025-11-28 18:04:40.083 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:04:40 compute-0 nova_compute[187223]: 2025-11-28 18:04:40.744 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:04:45 compute-0 nova_compute[187223]: 2025-11-28 18:04:45.085 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:04:45 compute-0 nova_compute[187223]: 2025-11-28 18:04:45.747 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:04:47 compute-0 podman[220637]: 2025-11-28 18:04:47.196473468 +0000 UTC m=+0.058017768 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent)
Nov 28 18:04:48 compute-0 ovn_controller[95574]: 2025-11-28T18:04:48Z|00229|memory_trim|INFO|Detected inactivity (last active 30018 ms ago): trimming memory
Nov 28 18:04:50 compute-0 nova_compute[187223]: 2025-11-28 18:04:50.087 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:04:50 compute-0 nova_compute[187223]: 2025-11-28 18:04:50.747 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:04:52 compute-0 nova_compute[187223]: 2025-11-28 18:04:52.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 18:04:53 compute-0 nova_compute[187223]: 2025-11-28 18:04:53.680 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 18:04:53 compute-0 nova_compute[187223]: 2025-11-28 18:04:53.705 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 18:04:53 compute-0 nova_compute[187223]: 2025-11-28 18:04:53.705 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 18:04:53 compute-0 nova_compute[187223]: 2025-11-28 18:04:53.705 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 18:04:55 compute-0 nova_compute[187223]: 2025-11-28 18:04:55.096 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:04:55 compute-0 podman[220657]: 2025-11-28 18:04:55.198811706 +0000 UTC m=+0.059697125 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 28 18:04:55 compute-0 podman[220658]: 2025-11-28 18:04:55.280302788 +0000 UTC m=+0.132771378 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Nov 28 18:04:55 compute-0 nova_compute[187223]: 2025-11-28 18:04:55.750 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:04:56 compute-0 nova_compute[187223]: 2025-11-28 18:04:56.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 18:04:57 compute-0 podman[220701]: 2025-11-28 18:04:57.188506879 +0000 UTC m=+0.052317395 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, distribution-scope=public, vcs-type=git, container_name=openstack_network_exporter, name=ubi9-minimal, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, architecture=x86_64, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 28 18:04:59 compute-0 podman[197556]: time="2025-11-28T18:04:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 18:04:59 compute-0 podman[197556]: @ - - [28/Nov/2025:18:04:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 28 18:04:59 compute-0 podman[197556]: @ - - [28/Nov/2025:18:04:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2600 "" "Go-http-client/1.1"
Nov 28 18:05:00 compute-0 nova_compute[187223]: 2025-11-28 18:05:00.101 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:05:00 compute-0 nova_compute[187223]: 2025-11-28 18:05:00.752 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:05:01 compute-0 openstack_network_exporter[199717]: ERROR   18:05:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 18:05:01 compute-0 openstack_network_exporter[199717]: ERROR   18:05:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 18:05:01 compute-0 openstack_network_exporter[199717]: ERROR   18:05:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 18:05:01 compute-0 openstack_network_exporter[199717]: ERROR   18:05:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 18:05:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 18:05:01 compute-0 openstack_network_exporter[199717]: ERROR   18:05:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 18:05:01 compute-0 openstack_network_exporter[199717]: 
Nov 28 18:05:03 compute-0 nova_compute[187223]: 2025-11-28 18:05:03.683 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 18:05:03 compute-0 nova_compute[187223]: 2025-11-28 18:05:03.684 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 18:05:03 compute-0 nova_compute[187223]: 2025-11-28 18:05:03.711 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 18:05:03 compute-0 nova_compute[187223]: 2025-11-28 18:05:03.712 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 18:05:03 compute-0 nova_compute[187223]: 2025-11-28 18:05:03.712 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 18:05:03 compute-0 nova_compute[187223]: 2025-11-28 18:05:03.712 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 18:05:03 compute-0 nova_compute[187223]: 2025-11-28 18:05:03.901 187227 WARNING nova.virt.libvirt.driver [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 18:05:03 compute-0 nova_compute[187223]: 2025-11-28 18:05:03.903 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5873MB free_disk=73.34106826782227GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 18:05:03 compute-0 nova_compute[187223]: 2025-11-28 18:05:03.903 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 18:05:03 compute-0 nova_compute[187223]: 2025-11-28 18:05:03.903 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 18:05:03 compute-0 nova_compute[187223]: 2025-11-28 18:05:03.967 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 18:05:03 compute-0 nova_compute[187223]: 2025-11-28 18:05:03.968 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 18:05:03 compute-0 nova_compute[187223]: 2025-11-28 18:05:03.996 187227 DEBUG nova.compute.provider_tree [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 18:05:04 compute-0 nova_compute[187223]: 2025-11-28 18:05:04.012 187227 DEBUG nova.scheduler.client.report [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Inventory has not changed for provider 2d742abf-eadd-46e1-bc0a-5ea4c6acfad5 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 18:05:04 compute-0 nova_compute[187223]: 2025-11-28 18:05:04.014 187227 DEBUG nova.compute.resource_tracker [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 18:05:04 compute-0 nova_compute[187223]: 2025-11-28 18:05:04.015 187227 DEBUG oslo_concurrency.lockutils [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 18:05:05 compute-0 nova_compute[187223]: 2025-11-28 18:05:05.016 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 18:05:05 compute-0 nova_compute[187223]: 2025-11-28 18:05:05.016 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 18:05:05 compute-0 nova_compute[187223]: 2025-11-28 18:05:05.016 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 18:05:05 compute-0 nova_compute[187223]: 2025-11-28 18:05:05.104 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:05:05 compute-0 nova_compute[187223]: 2025-11-28 18:05:05.216 187227 DEBUG nova.compute.manager [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 18:05:05 compute-0 nova_compute[187223]: 2025-11-28 18:05:05.216 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 18:05:05 compute-0 nova_compute[187223]: 2025-11-28 18:05:05.754 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:05:05 compute-0 nova_compute[187223]: 2025-11-28 18:05:05.879 187227 DEBUG oslo_service.periodic_task [None req-afc61433-7c6c-40f0-9d18-6709ba21caf2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 18:05:09 compute-0 podman[220723]: 2025-11-28 18:05:09.197470866 +0000 UTC m=+0.062948657 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 18:05:10 compute-0 nova_compute[187223]: 2025-11-28 18:05:10.107 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:05:10 compute-0 nova_compute[187223]: 2025-11-28 18:05:10.756 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:05:15 compute-0 nova_compute[187223]: 2025-11-28 18:05:15.110 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:05:15 compute-0 nova_compute[187223]: 2025-11-28 18:05:15.759 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:05:18 compute-0 podman[220747]: 2025-11-28 18:05:18.20040731 +0000 UTC m=+0.060012404 container health_status d3f47a0deab543ce7e3ba16f4834abe6225eb56eedaf2b4e462907b6d5aeb0e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125)
Nov 28 18:05:20 compute-0 nova_compute[187223]: 2025-11-28 18:05:20.112 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:05:20 compute-0 nova_compute[187223]: 2025-11-28 18:05:20.761 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:05:25 compute-0 nova_compute[187223]: 2025-11-28 18:05:25.115 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:05:25 compute-0 nova_compute[187223]: 2025-11-28 18:05:25.764 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:05:26 compute-0 podman[220766]: 2025-11-28 18:05:26.209655784 +0000 UTC m=+0.070279295 container health_status 077b040e5e4f452a96580993b5cf534d8e736fa5bc90bd7ff639f4af12cda176 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 18:05:26 compute-0 podman[220767]: 2025-11-28 18:05:26.235686193 +0000 UTC m=+0.083125329 container health_status 28c9bf48149e5692d6fe49351b831a720d3313c1c0e323350d8629163ac55adc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 28 18:05:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:05:27.720 104433 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 18:05:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:05:27.721 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 18:05:27 compute-0 ovn_metadata_agent[104428]: 2025-11-28 18:05:27.721 104433 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 18:05:28 compute-0 podman[220813]: 2025-11-28 18:05:28.213841858 +0000 UTC m=+0.066753376 container health_status 6501116745e2bbb28d8bbd78fc1749c8e32a774b4ce8306e955baa8e1e4ed882 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, vcs-type=git, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, vendor=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm)
Nov 28 18:05:29 compute-0 podman[197556]: time="2025-11-28T18:05:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 18:05:29 compute-0 podman[197556]: @ - - [28/Nov/2025:18:05:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17090 "" "Go-http-client/1.1"
Nov 28 18:05:29 compute-0 podman[197556]: @ - - [28/Nov/2025:18:05:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2598 "" "Go-http-client/1.1"
Nov 28 18:05:30 compute-0 nova_compute[187223]: 2025-11-28 18:05:30.117 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:05:30 compute-0 nova_compute[187223]: 2025-11-28 18:05:30.766 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:05:31 compute-0 openstack_network_exporter[199717]: ERROR   18:05:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 18:05:31 compute-0 openstack_network_exporter[199717]: ERROR   18:05:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 18:05:31 compute-0 openstack_network_exporter[199717]: ERROR   18:05:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 18:05:31 compute-0 openstack_network_exporter[199717]: ERROR   18:05:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 18:05:31 compute-0 openstack_network_exporter[199717]: 
Nov 28 18:05:31 compute-0 openstack_network_exporter[199717]: ERROR   18:05:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 18:05:31 compute-0 openstack_network_exporter[199717]: 
Nov 28 18:05:32 compute-0 sshd-session[220834]: Accepted publickey for zuul from 192.168.122.10 port 60986 ssh2: ECDSA SHA256:WZF57Px3euv6N78lWIGFzFXKuQ6jzdjGOx/DgOfNYD8
Nov 28 18:05:32 compute-0 systemd-logind[788]: New session 47 of user zuul.
Nov 28 18:05:32 compute-0 systemd[1]: Started Session 47 of User zuul.
Nov 28 18:05:32 compute-0 sshd-session[220834]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 28 18:05:32 compute-0 sudo[220838]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Nov 28 18:05:32 compute-0 sudo[220838]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 28 18:05:35 compute-0 nova_compute[187223]: 2025-11-28 18:05:35.171 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:05:35 compute-0 nova_compute[187223]: 2025-11-28 18:05:35.768 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:05:38 compute-0 ovs-vsctl[221053]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Nov 28 18:05:39 compute-0 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 220862 (sos)
Nov 28 18:05:39 compute-0 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Nov 28 18:05:39 compute-0 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Nov 28 18:05:39 compute-0 podman[221099]: 2025-11-28 18:05:39.754113434 +0000 UTC m=+0.090205081 container health_status d4a6a9a8fa51e9e2ee09f5c7b39f3538e92d9e96d2224fbe8c881c9d58ffaaaf (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 18:05:39 compute-0 virtqemud[186845]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Nov 28 18:05:39 compute-0 virtqemud[186845]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Nov 28 18:05:40 compute-0 virtqemud[186845]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Nov 28 18:05:40 compute-0 nova_compute[187223]: 2025-11-28 18:05:40.175 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:05:40 compute-0 nova_compute[187223]: 2025-11-28 18:05:40.770 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:05:41 compute-0 crontab[221494]: (root) LIST (root)
Nov 28 18:05:43 compute-0 systemd[1]: Starting Hostname Service...
Nov 28 18:05:43 compute-0 systemd[1]: Started Hostname Service.
Nov 28 18:05:45 compute-0 nova_compute[187223]: 2025-11-28 18:05:45.179 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 18:05:45 compute-0 nova_compute[187223]: 2025-11-28 18:05:45.772 187227 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
